Oct 04 02:40:19 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 04 02:40:19 crc restorecon[4744]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 04 02:40:19 crc restorecon[4744]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc 
restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc 
restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 
02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc 
restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 02:40:19 crc 
restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 
crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 
02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 02:40:19 crc 
restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc 
restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc 
restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 
crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc 
restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc 
restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc 
restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc 
restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc 
restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 02:40:19 crc restorecon[4744]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 02:40:19 crc restorecon[4744]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 04 02:40:20 crc kubenswrapper[4964]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 04 02:40:20 crc kubenswrapper[4964]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 04 02:40:20 crc kubenswrapper[4964]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 04 02:40:20 crc kubenswrapper[4964]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 04 02:40:20 crc kubenswrapper[4964]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 04 02:40:20 crc kubenswrapper[4964]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.577400 4964 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586050 4964 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586093 4964 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586105 4964 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586116 4964 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586125 4964 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586136 4964 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586145 4964 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586154 4964 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586163 4964 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586170 4964 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586178 4964 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586186 4964 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586194 4964 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586201 4964 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586209 4964 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586218 4964 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586225 4964 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 04 02:40:20 crc 
kubenswrapper[4964]: W1004 02:40:20.586233 4964 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586240 4964 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586248 4964 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586256 4964 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586264 4964 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586274 4964 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586284 4964 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586292 4964 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586300 4964 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586307 4964 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586315 4964 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586322 4964 feature_gate.go:330] unrecognized feature gate: Example Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586330 4964 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586346 4964 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 
02:40:20.586354 4964 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586362 4964 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586370 4964 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586377 4964 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586385 4964 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586395 4964 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586403 4964 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586413 4964 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586421 4964 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586429 4964 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586438 4964 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586446 4964 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586454 4964 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586462 4964 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586469 4964 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 04 02:40:20 
crc kubenswrapper[4964]: W1004 02:40:20.586477 4964 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586484 4964 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586492 4964 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586499 4964 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586507 4964 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586514 4964 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586522 4964 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586529 4964 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586537 4964 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586547 4964 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586556 4964 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586565 4964 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586572 4964 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586580 4964 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586587 4964 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586594 4964 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586603 4964 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586611 4964 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586649 4964 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586657 4964 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586664 4964 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586672 4964 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586682 4964 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586692 4964 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.586704 4964 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.586844 4964 flags.go:64] FLAG: --address="0.0.0.0" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.586862 4964 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.586877 4964 flags.go:64] FLAG: --anonymous-auth="true" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.586888 4964 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.586900 4964 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.586910 4964 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.586922 4964 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.586935 4964 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.586945 4964 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.586954 4964 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.586963 4964 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.586975 4964 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.586984 4964 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.586993 4964 flags.go:64] FLAG: 
--cgroup-root="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587001 4964 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587010 4964 flags.go:64] FLAG: --client-ca-file="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587019 4964 flags.go:64] FLAG: --cloud-config="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587028 4964 flags.go:64] FLAG: --cloud-provider="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587040 4964 flags.go:64] FLAG: --cluster-dns="[]" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587051 4964 flags.go:64] FLAG: --cluster-domain="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587060 4964 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587069 4964 flags.go:64] FLAG: --config-dir="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587078 4964 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587088 4964 flags.go:64] FLAG: --container-log-max-files="5" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587100 4964 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587109 4964 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587118 4964 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587127 4964 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587137 4964 flags.go:64] FLAG: --contention-profiling="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587146 4964 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587155 4964 flags.go:64] FLAG: 
--cpu-cfs-quota-period="100ms" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587164 4964 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587173 4964 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587184 4964 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587193 4964 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587202 4964 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587211 4964 flags.go:64] FLAG: --enable-load-reader="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587221 4964 flags.go:64] FLAG: --enable-server="true" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587230 4964 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587241 4964 flags.go:64] FLAG: --event-burst="100" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587250 4964 flags.go:64] FLAG: --event-qps="50" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587259 4964 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587268 4964 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587277 4964 flags.go:64] FLAG: --eviction-hard="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587288 4964 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587297 4964 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587305 4964 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587316 4964 
flags.go:64] FLAG: --eviction-soft="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587324 4964 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587333 4964 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587343 4964 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587353 4964 flags.go:64] FLAG: --experimental-mounter-path="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587362 4964 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587372 4964 flags.go:64] FLAG: --fail-swap-on="true" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587381 4964 flags.go:64] FLAG: --feature-gates="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587398 4964 flags.go:64] FLAG: --file-check-frequency="20s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587407 4964 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587416 4964 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587427 4964 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587436 4964 flags.go:64] FLAG: --healthz-port="10248" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587446 4964 flags.go:64] FLAG: --help="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587455 4964 flags.go:64] FLAG: --hostname-override="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587499 4964 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587508 4964 flags.go:64] FLAG: --http-check-frequency="20s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587518 4964 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587527 4964 flags.go:64] FLAG: --image-credential-provider-config="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587536 4964 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587544 4964 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587553 4964 flags.go:64] FLAG: --image-service-endpoint="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587562 4964 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587571 4964 flags.go:64] FLAG: --kube-api-burst="100" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587582 4964 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587592 4964 flags.go:64] FLAG: --kube-api-qps="50" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587600 4964 flags.go:64] FLAG: --kube-reserved="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587610 4964 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587644 4964 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587654 4964 flags.go:64] FLAG: --kubelet-cgroups="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587663 4964 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587673 4964 flags.go:64] FLAG: --lock-file="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587681 4964 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587691 4964 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587701 4964 
flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587716 4964 flags.go:64] FLAG: --log-json-split-stream="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587726 4964 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587736 4964 flags.go:64] FLAG: --log-text-split-stream="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587745 4964 flags.go:64] FLAG: --logging-format="text" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587754 4964 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587763 4964 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587772 4964 flags.go:64] FLAG: --manifest-url="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587781 4964 flags.go:64] FLAG: --manifest-url-header="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587794 4964 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587803 4964 flags.go:64] FLAG: --max-open-files="1000000" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587814 4964 flags.go:64] FLAG: --max-pods="110" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587824 4964 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587833 4964 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587842 4964 flags.go:64] FLAG: --memory-manager-policy="None" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587852 4964 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587861 4964 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 04 02:40:20 crc 
kubenswrapper[4964]: I1004 02:40:20.587870 4964 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587879 4964 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587899 4964 flags.go:64] FLAG: --node-status-max-images="50" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587908 4964 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587917 4964 flags.go:64] FLAG: --oom-score-adj="-999" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587926 4964 flags.go:64] FLAG: --pod-cidr="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587935 4964 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587951 4964 flags.go:64] FLAG: --pod-manifest-path="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587960 4964 flags.go:64] FLAG: --pod-max-pids="-1" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587969 4964 flags.go:64] FLAG: --pods-per-core="0" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587978 4964 flags.go:64] FLAG: --port="10250" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587988 4964 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.587997 4964 flags.go:64] FLAG: --provider-id="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588006 4964 flags.go:64] FLAG: --qos-reserved="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588015 4964 flags.go:64] FLAG: --read-only-port="10255" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588024 4964 flags.go:64] FLAG: --register-node="true" Oct 04 02:40:20 crc 
kubenswrapper[4964]: I1004 02:40:20.588034 4964 flags.go:64] FLAG: --register-schedulable="true" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588043 4964 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588058 4964 flags.go:64] FLAG: --registry-burst="10" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588067 4964 flags.go:64] FLAG: --registry-qps="5" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588076 4964 flags.go:64] FLAG: --reserved-cpus="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588086 4964 flags.go:64] FLAG: --reserved-memory="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588097 4964 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588107 4964 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588116 4964 flags.go:64] FLAG: --rotate-certificates="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588125 4964 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588134 4964 flags.go:64] FLAG: --runonce="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588143 4964 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588153 4964 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588162 4964 flags.go:64] FLAG: --seccomp-default="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588171 4964 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588179 4964 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588189 4964 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 
04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588199 4964 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588208 4964 flags.go:64] FLAG: --storage-driver-password="root" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588217 4964 flags.go:64] FLAG: --storage-driver-secure="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588226 4964 flags.go:64] FLAG: --storage-driver-table="stats" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588234 4964 flags.go:64] FLAG: --storage-driver-user="root" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588243 4964 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588254 4964 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588269 4964 flags.go:64] FLAG: --system-cgroups="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588278 4964 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588292 4964 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588301 4964 flags.go:64] FLAG: --tls-cert-file="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588310 4964 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588322 4964 flags.go:64] FLAG: --tls-min-version="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588331 4964 flags.go:64] FLAG: --tls-private-key-file="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588339 4964 flags.go:64] FLAG: --topology-manager-policy="none" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588349 4964 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588358 4964 flags.go:64] FLAG: 
--topology-manager-scope="container" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588367 4964 flags.go:64] FLAG: --v="2" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588380 4964 flags.go:64] FLAG: --version="false" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588391 4964 flags.go:64] FLAG: --vmodule="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588403 4964 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.588413 4964 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588646 4964 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588657 4964 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588667 4964 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588675 4964 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588683 4964 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588691 4964 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588699 4964 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588707 4964 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588715 4964 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588723 4964 feature_gate.go:330] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588732 4964 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588741 4964 feature_gate.go:330] unrecognized feature gate: Example Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588748 4964 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588756 4964 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588764 4964 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588772 4964 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588780 4964 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588792 4964 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588800 4964 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588807 4964 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588818 4964 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588827 4964 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588836 4964 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588845 4964 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588853 4964 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588868 4964 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588876 4964 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588884 4964 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588892 4964 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588899 4964 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588907 4964 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588915 4964 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588923 4964 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588931 4964 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588938 4964 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 
02:40:20.588946 4964 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588953 4964 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588961 4964 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588969 4964 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588977 4964 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588985 4964 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.588995 4964 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589005 4964 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589013 4964 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589022 4964 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589031 4964 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589040 4964 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589049 4964 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589057 4964 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589068 4964 feature_gate.go:330] unrecognized 
feature gate: InsightsRuntimeExtractor Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589079 4964 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589089 4964 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589100 4964 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589109 4964 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589118 4964 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589126 4964 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589134 4964 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589145 4964 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589153 4964 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589160 4964 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589169 4964 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589176 4964 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589184 4964 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 
02:40:20.589192 4964 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589199 4964 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589206 4964 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589214 4964 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589222 4964 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589229 4964 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589237 4964 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.589245 4964 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.589257 4964 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.601711 4964 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.601764 4964 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.601896 4964 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 04 02:40:20 crc 
kubenswrapper[4964]: W1004 02:40:20.601915 4964 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.601925 4964 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.601936 4964 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.601944 4964 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.601953 4964 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.601961 4964 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.601970 4964 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.601979 4964 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.601987 4964 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.601995 4964 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602003 4964 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602011 4964 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602020 4964 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602028 4964 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 
02:40:20.602036 4964 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602044 4964 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602053 4964 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602063 4964 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602071 4964 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602079 4964 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602090 4964 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602101 4964 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602112 4964 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602122 4964 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602132 4964 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602143 4964 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602153 4964 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602163 4964 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602177 4964 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. 
It will be removed in a future release. Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602192 4964 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602204 4964 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602213 4964 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602223 4964 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602236 4964 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602246 4964 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602257 4964 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602267 4964 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602277 4964 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602288 4964 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602297 4964 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602308 4964 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602318 4964 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602328 4964 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 
02:40:20.602336 4964 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602343 4964 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602351 4964 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602359 4964 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602366 4964 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602375 4964 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602382 4964 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602392 4964 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602399 4964 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602407 4964 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602415 4964 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602423 4964 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602431 4964 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602438 4964 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602449 4964 feature_gate.go:353] Setting GA feature gate 
CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602459 4964 feature_gate.go:330] unrecognized feature gate: Example Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602468 4964 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602476 4964 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602483 4964 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602491 4964 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602501 4964 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602511 4964 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602519 4964 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602526 4964 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602534 4964 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602542 4964 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602553 4964 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.602567 4964 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602847 4964 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602865 4964 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602875 4964 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602886 4964 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602895 4964 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602904 4964 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602912 4964 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602921 4964 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602930 4964 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602939 4964 feature_gate.go:330] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602950 4964 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602961 4964 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602972 4964 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602984 4964 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.602994 4964 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603003 4964 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603012 4964 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603020 4964 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603029 4964 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603037 4964 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603046 4964 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603054 4964 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603061 4964 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603070 4964 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 04 02:40:20 crc 
kubenswrapper[4964]: W1004 02:40:20.603079 4964 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603087 4964 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603097 4964 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603106 4964 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603115 4964 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603124 4964 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603133 4964 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603141 4964 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603150 4964 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603158 4964 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603167 4964 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603176 4964 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603184 4964 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603192 4964 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 04 02:40:20 crc 
kubenswrapper[4964]: W1004 02:40:20.603200 4964 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603208 4964 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603216 4964 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603224 4964 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603232 4964 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603239 4964 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603247 4964 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603256 4964 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603263 4964 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603272 4964 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603280 4964 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603287 4964 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603295 4964 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603303 4964 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603311 4964 feature_gate.go:330] unrecognized 
feature gate: BootcNodeManagement Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603319 4964 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603330 4964 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603340 4964 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603349 4964 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603357 4964 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603367 4964 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603376 4964 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603384 4964 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603392 4964 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603400 4964 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603408 4964 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603416 4964 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603424 4964 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603432 4964 feature_gate.go:330] 
unrecognized feature gate: AdminNetworkPolicy Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603439 4964 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603447 4964 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603482 4964 feature_gate.go:330] unrecognized feature gate: Example Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.603495 4964 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.603514 4964 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.605893 4964 server.go:940] "Client rotation is on, will bootstrap in background" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.614161 4964 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.614303 4964 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.616711 4964 server.go:997] "Starting client certificate rotation" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.616760 4964 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.617798 4964 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-02 13:22:39.318986656 +0000 UTC Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.618016 4964 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1426h42m18.700977353s for next certificate rotation Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.645091 4964 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.650270 4964 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.668471 4964 log.go:25] "Validated CRI v1 runtime API" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.706845 4964 log.go:25] "Validated CRI v1 image API" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.709169 4964 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.714159 4964 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-04-02-35-57-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.714205 4964 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.749461 4964 manager.go:217] Machine: {Timestamp:2025-10-04 02:40:20.744867732 +0000 UTC m=+0.641826460 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:b0d50052-71be-478d-b81f-25c1a6e2025f BootID:5f4745f6-8127-4980-be1d-1af4770a22e1 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:92:f4:0e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:92:f4:0e Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:e2:6a:f7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:90:d8:12 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:c0:d0:73 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:8d:5e:4b Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:87:1a:81 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ce:0b:c8:44:a4:ac Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:7a:76:07:2e:1e:27 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] 
SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.749916 4964 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.750151 4964 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.752417 4964 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.752753 4964 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.752804 4964 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.753173 4964 topology_manager.go:138] "Creating topology manager with none policy"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.753191 4964 container_manager_linux.go:303] "Creating device plugin manager"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.753950 4964 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.754001 4964 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.755352 4964 state_mem.go:36] "Initialized new in-memory state store"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.755511 4964 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.758991 4964 kubelet.go:418] "Attempting to sync node with API server"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.759024 4964 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.759075 4964 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.759095 4964 kubelet.go:324] "Adding apiserver pod source"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.759113 4964 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.765779 4964 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.765796 4964 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Oct 04 02:40:20 crc kubenswrapper[4964]: E1004 02:40:20.765916 4964 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Oct 04 02:40:20 crc kubenswrapper[4964]: E1004 02:40:20.765923 4964 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.766027 4964 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.767061 4964 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.769733 4964 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.771415 4964 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.771457 4964 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.771475 4964 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.771498 4964 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.771520 4964 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.771535 4964 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.771548 4964 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.771569 4964 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.771584 4964 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.771598 4964 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.771670 4964 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.771685 4964 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.771731 4964 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.772259 4964 server.go:1280] "Started kubelet"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.773311 4964 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.773437 4964 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.773581 4964 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 04 02:40:20 crc systemd[1]: Started Kubernetes Kubelet.
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.774576 4964 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.776608 4964 server.go:460] "Adding debug handlers to kubelet server"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.781641 4964 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.781696 4964 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.781840 4964 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 02:19:37.465314228 +0000 UTC
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.781916 4964 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2111h39m16.683403069s for next certificate rotation
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.781928 4964 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.781950 4964 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 04 02:40:20 crc kubenswrapper[4964]: E1004 02:40:20.781982 4964 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.782029 4964 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 04 02:40:20 crc kubenswrapper[4964]: E1004 02:40:20.782310 4964 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="200ms"
Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.782524 4964 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Oct 04 02:40:20 crc kubenswrapper[4964]: E1004 02:40:20.782658 4964 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.783806 4964 factory.go:55] Registering systemd factory
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.786453 4964 factory.go:221] Registration of the systemd container factory successfully
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.786914 4964 factory.go:153] Registering CRI-O factory
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.786957 4964 factory.go:221] Registration of the crio container factory successfully
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.787055 4964 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.787086 4964 factory.go:103] Registering Raw factory
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.787109 4964 manager.go:1196] Started watching for new ooms in manager
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.788087 4964 manager.go:319] Starting recovery of all containers
Oct 04 02:40:20 crc kubenswrapper[4964]: E1004 02:40:20.786743 4964 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186b296b6c38f28f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-04 02:40:20.772221583 +0000 UTC m=+0.669180261,LastTimestamp:2025-10-04 02:40:20.772221583 +0000 UTC m=+0.669180261,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797489 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797578 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797603 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797671 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797692 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797709 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797730 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797777 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797804 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797824 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797840 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797857 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797874 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797893 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797919 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797937 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797955 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797974 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.797992 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798009 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798026 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798041 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798085 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798104 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798119 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798136 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798158 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798177 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798195 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798210 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798226 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798242 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798260 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798278 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798296 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798312 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798328 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798344 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798360 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798378 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798393 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798408 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798424 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798441 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798457 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798474 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798491 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798510 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798527 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798545 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798563 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798582 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798604 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798649 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798667 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798687 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798725 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798744 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798761 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798779 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798797 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798812 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798827 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798843 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798860 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798877 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798893 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798911 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798927 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798945 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798961 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798977 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.798994 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.799012 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.799028 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.799044 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.799059 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.799604 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.799773 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.799810 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.799823 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec"
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.799843 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.799856 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.799900 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.799911 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.799928 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.799940 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" 
seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.799993 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800013 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800030 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800082 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800102 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800142 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 
02:40:20.800169 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800185 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800232 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800248 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800269 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800313 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800328 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800342 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800362 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800399 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800412 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800473 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800496 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800524 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800570 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800584 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800601 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800655 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800675 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800694 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800762 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800780 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800848 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800867 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800915 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800931 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800945 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.800957 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801000 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801012 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801023 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" 
seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801035 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801048 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801081 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801094 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801106 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801122 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 04 02:40:20 crc 
kubenswrapper[4964]: I1004 02:40:20.801135 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801168 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801182 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801194 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801207 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801240 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801255 4964 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801267 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801279 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801290 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801324 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801337 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801348 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801360 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801372 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801407 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801420 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801431 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801443 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801456 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801467 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801500 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801512 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801524 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801535 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801547 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801578 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801588 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801600 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801675 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801691 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801707 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801719 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801731 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801767 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801781 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801794 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801806 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801840 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801853 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801865 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801878 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801889 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.801923 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.804438 4964 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805159 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805206 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805243 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805303 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805338 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805365 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805394 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805425 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805454 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805475 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805496 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805517 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805539 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805560 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805579 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805598 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" 
seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805658 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805685 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805708 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805727 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805748 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805771 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 04 02:40:20 crc 
kubenswrapper[4964]: I1004 02:40:20.805790 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805809 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805828 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805847 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805866 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805959 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.805984 4964 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.806007 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.806026 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.806046 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.806066 4964 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.806085 4964 reconstruct.go:97] "Volume reconstruction finished" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.806098 4964 reconciler.go:26] "Reconciler: start to sync state" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.817044 4964 manager.go:324] Recovery completed Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.831820 4964 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.836375 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.836418 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.836436 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.839048 4964 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.839071 4964 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.839093 4964 state_mem.go:36] "Initialized new in-memory state store" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.840351 4964 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.843934 4964 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.843999 4964 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.844033 4964 kubelet.go:2335] "Starting kubelet main sync loop" Oct 04 02:40:20 crc kubenswrapper[4964]: E1004 02:40:20.844173 4964 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 04 02:40:20 crc kubenswrapper[4964]: W1004 02:40:20.848760 4964 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 04 02:40:20 crc kubenswrapper[4964]: E1004 02:40:20.848847 4964 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.861941 4964 policy_none.go:49] "None policy: Start" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.863055 4964 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.863085 4964 state_mem.go:35] "Initializing new in-memory state store" Oct 04 02:40:20 crc kubenswrapper[4964]: E1004 02:40:20.882857 4964 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.928372 4964 manager.go:334] "Starting Device Plugin manager" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.928548 4964 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.928646 4964 server.go:79] "Starting device plugin registration server" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.929109 4964 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.929199 4964 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.929392 4964 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.929527 4964 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.929542 4964 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 04 02:40:20 crc kubenswrapper[4964]: E1004 02:40:20.937532 4964 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.944573 4964 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.944710 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.945851 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.945903 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.945921 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.946124 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.946448 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.946512 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.947946 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.947991 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.948012 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.948141 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.948362 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.948496 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.949208 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.949253 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.949271 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.949419 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.949569 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.949647 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.950494 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.950533 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.950551 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.951082 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.951132 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.951154 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.951299 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.951339 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.951357 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.952327 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:20 crc 
kubenswrapper[4964]: I1004 02:40:20.952369 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.952386 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.952641 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.952789 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.952839 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.954377 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.954442 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.954465 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.954749 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.954789 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.954807 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.955505 4964 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.955586 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.957106 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.957144 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:20 crc kubenswrapper[4964]: I1004 02:40:20.957160 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:20 crc kubenswrapper[4964]: E1004 02:40:20.983398 4964 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="400ms" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.008849 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.008900 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.008941 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.008971 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.009005 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.009177 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.009209 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.009259 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.009331 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.009433 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.009505 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.009555 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.009593 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.009813 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.009879 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.030119 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.032081 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.032131 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.032167 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.032201 4964 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 04 02:40:21 crc kubenswrapper[4964]: E1004 02:40:21.032951 4964 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Oct 04 02:40:21 crc 
kubenswrapper[4964]: I1004 02:40:21.111829 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112041 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112197 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112154 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112257 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112284 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112306 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112328 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112349 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112370 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112390 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112418 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112453 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112441 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112486 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112507 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: 
I1004 02:40:21.112483 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112419 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112556 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112513 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112567 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112574 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112448 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112573 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112590 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112834 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112605 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112654 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112930 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.112606 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.234048 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.236158 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.236198 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.236210 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.236240 4964 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 04 02:40:21 crc kubenswrapper[4964]: E1004 02:40:21.236716 4964 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection 
refused" node="crc" Oct 04 02:40:21 crc kubenswrapper[4964]: E1004 02:40:21.292976 4964 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186b296b6c38f28f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-04 02:40:20.772221583 +0000 UTC m=+0.669180261,LastTimestamp:2025-10-04 02:40:20.772221583 +0000 UTC m=+0.669180261,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.303339 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.331503 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.362697 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: W1004 02:40:21.366536 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-07f344eb0940889b4276f01455cbe5de7f1264b05764503421cda343779cad4c WatchSource:0}: Error finding container 07f344eb0940889b4276f01455cbe5de7f1264b05764503421cda343779cad4c: Status 404 returned error can't find the container with id 07f344eb0940889b4276f01455cbe5de7f1264b05764503421cda343779cad4c Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.376219 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: W1004 02:40:21.380457 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-b47f05cb08b6faa754dbc9228ab89a5d33b11b4bce3a521604a43fec1d4e353f WatchSource:0}: Error finding container b47f05cb08b6faa754dbc9228ab89a5d33b11b4bce3a521604a43fec1d4e353f: Status 404 returned error can't find the container with id b47f05cb08b6faa754dbc9228ab89a5d33b11b4bce3a521604a43fec1d4e353f Oct 04 02:40:21 crc kubenswrapper[4964]: E1004 02:40:21.384244 4964 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="800ms" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.385592 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 04 02:40:21 crc kubenswrapper[4964]: W1004 02:40:21.400496 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-425466b20966d1e97d85fc610bacd3a2c0a6d08b5288b9f23bdd9c1b0e920b44 WatchSource:0}: Error finding container 425466b20966d1e97d85fc610bacd3a2c0a6d08b5288b9f23bdd9c1b0e920b44: Status 404 returned error can't find the container with id 425466b20966d1e97d85fc610bacd3a2c0a6d08b5288b9f23bdd9c1b0e920b44 Oct 04 02:40:21 crc kubenswrapper[4964]: W1004 02:40:21.411357 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-00804b9d3e2000806f7744f6fc88779e95497b93e1e50222c1a91095372cf77c WatchSource:0}: Error finding container 00804b9d3e2000806f7744f6fc88779e95497b93e1e50222c1a91095372cf77c: Status 404 returned error can't find the container with id 00804b9d3e2000806f7744f6fc88779e95497b93e1e50222c1a91095372cf77c Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.637556 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.639841 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.639902 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.639920 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.639957 4964 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 04 02:40:21 crc kubenswrapper[4964]: E1004 
02:40:21.640470 4964 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Oct 04 02:40:21 crc kubenswrapper[4964]: W1004 02:40:21.745959 4964 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 04 02:40:21 crc kubenswrapper[4964]: E1004 02:40:21.746080 4964 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.774846 4964 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.848997 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b47f05cb08b6faa754dbc9228ab89a5d33b11b4bce3a521604a43fec1d4e353f"} Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.852028 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"07f344eb0940889b4276f01455cbe5de7f1264b05764503421cda343779cad4c"} Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.853498 4964 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d6cb34ded40373c89e944baeacf99e292c9cfb778408f9c1747b19b79654f733"} Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.854234 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"00804b9d3e2000806f7744f6fc88779e95497b93e1e50222c1a91095372cf77c"} Oct 04 02:40:21 crc kubenswrapper[4964]: I1004 02:40:21.855228 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"425466b20966d1e97d85fc610bacd3a2c0a6d08b5288b9f23bdd9c1b0e920b44"} Oct 04 02:40:22 crc kubenswrapper[4964]: W1004 02:40:22.128511 4964 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 04 02:40:22 crc kubenswrapper[4964]: E1004 02:40:22.128970 4964 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 04 02:40:22 crc kubenswrapper[4964]: E1004 02:40:22.185565 4964 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="1.6s" Oct 04 02:40:22 crc kubenswrapper[4964]: W1004 02:40:22.316534 4964 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 04 02:40:22 crc kubenswrapper[4964]: E1004 02:40:22.316688 4964 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.440646 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.442796 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.442862 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.442882 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.442920 4964 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 04 02:40:22 crc kubenswrapper[4964]: E1004 02:40:22.443504 4964 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Oct 04 02:40:22 crc kubenswrapper[4964]: W1004 02:40:22.443745 4964 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 04 02:40:22 crc kubenswrapper[4964]: E1004 02:40:22.443843 4964 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.774311 4964 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.860717 4964 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62" exitCode=0 Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.860801 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62"} Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.860910 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.862322 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.862384 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 
02:40:22.862409 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.862913 4964 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="616b42c86af7b399da4a366796eaeb929444273db3b02f728e283bdfe28c3a09" exitCode=0 Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.863007 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"616b42c86af7b399da4a366796eaeb929444273db3b02f728e283bdfe28c3a09"} Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.863063 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.863922 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.863979 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.863997 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.864992 4964 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f" exitCode=0 Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.865056 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f"} Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.865101 4964 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.866018 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.866313 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.866354 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.866371 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.867155 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.867205 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.867222 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.868714 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b"} Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.868757 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e"} Oct 04 02:40:22 crc 
kubenswrapper[4964]: I1004 02:40:22.868773 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957"} Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.871049 4964 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb" exitCode=0 Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.871079 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb"} Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.871188 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.872555 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.872591 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:22 crc kubenswrapper[4964]: I1004 02:40:22.872603 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:23 crc kubenswrapper[4964]: W1004 02:40:23.502761 4964 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 04 02:40:23 crc kubenswrapper[4964]: E1004 02:40:23.502852 4964 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.774233 4964 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Oct 04 02:40:23 crc kubenswrapper[4964]: E1004 02:40:23.786478 4964 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="3.2s"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.880036 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1"}
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.880112 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49"}
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.880130 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3"}
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.880143 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0"}
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.883192 4964 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f89608ab72333c7613b66fecfe9f379e00c9eedc77bd9b8f7d7cc6594959f080" exitCode=0
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.883306 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f89608ab72333c7613b66fecfe9f379e00c9eedc77bd9b8f7d7cc6594959f080"}
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.883411 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.885839 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.885890 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.885950 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.889880 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"87fdd14ec19d65c508672f966db71a585458cbea0a944a6960b56d8e0160eafe"}
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.889925 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.891293 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.891341 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.891354 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.896078 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418"}
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.896250 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.899825 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.899854 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.899869 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.905221 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a94e58235a34a6260cb38523b1823b73a5083dcff7f3af4167635bc394289072"}
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.905281 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"79910f39f896467b97e336bf2d9ddc46fc68ce22e15162a78865049569c03c81"}
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.905311 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"391bbf5e9b0494eb2fb5ab2d016230d876a51bd83eb52e759819dd11d54c1dc2"}
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.905353 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.906189 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.906213 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:23 crc kubenswrapper[4964]: I1004 02:40:23.906221 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.044158 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.045778 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.045829 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.045846 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.045877 4964 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 04 02:40:24 crc kubenswrapper[4964]: E1004 02:40:24.047197 4964 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc"
Oct 04 02:40:24 crc kubenswrapper[4964]: W1004 02:40:24.179457 4964 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Oct 04 02:40:24 crc kubenswrapper[4964]: E1004 02:40:24.179543 4964 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Oct 04 02:40:24 crc kubenswrapper[4964]: W1004 02:40:24.317428 4964 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Oct 04 02:40:24 crc kubenswrapper[4964]: E1004 02:40:24.317567 4964 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Oct 04 02:40:24 crc kubenswrapper[4964]: W1004 02:40:24.331606 4964 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Oct 04 02:40:24 crc kubenswrapper[4964]: E1004 02:40:24.331685 4964 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.505896 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.914033 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106"}
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.914158 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.915683 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.915728 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.915746 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.916822 4964 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a1e907616bad5b8360e9b953786a3c8878ddf6ff319ca5a1ed32feec227a5e83" exitCode=0
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.916907 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a1e907616bad5b8360e9b953786a3c8878ddf6ff319ca5a1ed32feec227a5e83"}
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.916938 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.916999 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.917084 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.917128 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.917086 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.918734 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.918788 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.918806 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.919262 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.919297 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.919486 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.919504 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.919330 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.919558 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.919581 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.919446 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.919680 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:24 crc kubenswrapper[4964]: I1004 02:40:24.990073 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.924319 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.925201 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"811f1cdfa9acfdae7c1a2cf93d39d272575ef104314e3ce20e6d9cd0decda3a5"}
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.925255 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"413f9903ad851ef827bc418226f805580239a49bee5e331f6c5f3738ef34c3b2"}
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.925274 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cbf54c8e3239be0eaaf2f90f7dd70463f5e21a956be327a7976fae4ea96b0530"}
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.925385 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.925775 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.925849 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.927020 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.927093 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.927115 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.928050 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.928095 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.928124 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.930837 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.930881 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:25 crc kubenswrapper[4964]: I1004 02:40:25.930909 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:26 crc kubenswrapper[4964]: I1004 02:40:26.931110 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"47d9927624ca800063b9c628444d84e1ff3cb447d4291d00921c8306e3a8ad5c"}
Oct 04 02:40:26 crc kubenswrapper[4964]: I1004 02:40:26.931169 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6c887b45983e03e922d4c6667c3095d5bbd78e1b43c7d570dd688ea4ccb94153"}
Oct 04 02:40:26 crc kubenswrapper[4964]: I1004 02:40:26.931201 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:26 crc kubenswrapper[4964]: I1004 02:40:26.931335 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:26 crc kubenswrapper[4964]: I1004 02:40:26.932075 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:26 crc kubenswrapper[4964]: I1004 02:40:26.932385 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:26 crc kubenswrapper[4964]: I1004 02:40:26.932404 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:26 crc kubenswrapper[4964]: I1004 02:40:26.932412 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:26 crc kubenswrapper[4964]: I1004 02:40:26.933083 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:26 crc kubenswrapper[4964]: I1004 02:40:26.933136 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:26 crc kubenswrapper[4964]: I1004 02:40:26.933154 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:26 crc kubenswrapper[4964]: I1004 02:40:26.933889 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:26 crc kubenswrapper[4964]: I1004 02:40:26.933931 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:26 crc kubenswrapper[4964]: I1004 02:40:26.933949 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.247411 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.249291 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.249363 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.249403 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.249444 4964 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.307415 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.506397 4964 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.506494 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.889455 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.934327 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.934409 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.935924 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.935974 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.935990 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.936331 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.936386 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:27 crc kubenswrapper[4964]: I1004 02:40:27.936406 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:28 crc kubenswrapper[4964]: I1004 02:40:28.937354 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:28 crc kubenswrapper[4964]: I1004 02:40:28.938515 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:28 crc kubenswrapper[4964]: I1004 02:40:28.938581 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:28 crc kubenswrapper[4964]: I1004 02:40:28.938603 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:29 crc kubenswrapper[4964]: I1004 02:40:29.729668 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 04 02:40:29 crc kubenswrapper[4964]: I1004 02:40:29.729899 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:29 crc kubenswrapper[4964]: I1004 02:40:29.731342 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:29 crc kubenswrapper[4964]: I1004 02:40:29.731388 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:29 crc kubenswrapper[4964]: I1004 02:40:29.731404 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:30 crc kubenswrapper[4964]: E1004 02:40:30.937741 4964 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 04 02:40:31 crc kubenswrapper[4964]: I1004 02:40:31.466722 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Oct 04 02:40:31 crc kubenswrapper[4964]: I1004 02:40:31.466968 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:31 crc kubenswrapper[4964]: I1004 02:40:31.468803 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:31 crc kubenswrapper[4964]: I1004 02:40:31.468884 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:31 crc kubenswrapper[4964]: I1004 02:40:31.468908 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:31 crc kubenswrapper[4964]: I1004 02:40:31.653928 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 04 02:40:31 crc kubenswrapper[4964]: I1004 02:40:31.654198 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:31 crc kubenswrapper[4964]: I1004 02:40:31.655776 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:31 crc kubenswrapper[4964]: I1004 02:40:31.655835 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:31 crc kubenswrapper[4964]: I1004 02:40:31.655857 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:32 crc kubenswrapper[4964]: I1004 02:40:32.420685 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 04 02:40:32 crc kubenswrapper[4964]: I1004 02:40:32.420950 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:32 crc kubenswrapper[4964]: I1004 02:40:32.422832 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:32 crc kubenswrapper[4964]: I1004 02:40:32.422906 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:32 crc kubenswrapper[4964]: I1004 02:40:32.422925 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:32 crc kubenswrapper[4964]: I1004 02:40:32.429400 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 04 02:40:32 crc kubenswrapper[4964]: I1004 02:40:32.948140 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:32 crc kubenswrapper[4964]: I1004 02:40:32.949695 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:32 crc kubenswrapper[4964]: I1004 02:40:32.949756 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:32 crc kubenswrapper[4964]: I1004 02:40:32.949803 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:32 crc kubenswrapper[4964]: I1004 02:40:32.955525 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 04 02:40:33 crc kubenswrapper[4964]: I1004 02:40:33.953259 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:33 crc kubenswrapper[4964]: I1004 02:40:33.955170 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:33 crc kubenswrapper[4964]: I1004 02:40:33.955260 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:33 crc kubenswrapper[4964]: I1004 02:40:33.955318 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:34 crc kubenswrapper[4964]: I1004 02:40:34.774839 4964 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Oct 04 02:40:35 crc kubenswrapper[4964]: I1004 02:40:35.504544 4964 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 04 02:40:35 crc kubenswrapper[4964]: I1004 02:40:35.504611 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 04 02:40:35 crc kubenswrapper[4964]: I1004 02:40:35.509572 4964 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 04 02:40:35 crc kubenswrapper[4964]: I1004 02:40:35.509710 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.348347 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.348670 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.350565 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.350663 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.350683 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.369724 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.506026 4964 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.506150 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.897239 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.897447 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.898946 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.898988 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.899004 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.903939 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.975383 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.975472 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.976782 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.976829 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.976846 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.977331 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.977389 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 04 02:40:37 crc kubenswrapper[4964]: I1004 02:40:37.977408 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.489850 4964 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.498944 4964 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.500570 4964 trace.go:236] Trace[344641525]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Oct-2025 02:40:29.240) (total time: 11260ms):
Oct 04 02:40:40 crc kubenswrapper[4964]: Trace[344641525]: ---"Objects listed" error: 11260ms (02:40:40.500)
Oct 04 02:40:40 crc kubenswrapper[4964]: Trace[344641525]: [11.260133558s] [11.260133558s] END
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.500689 4964 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.501198 4964 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.502568 4964 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.503070 4964 trace.go:236] Trace[1105883047]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Oct-2025 02:40:27.496) (total time: 13006ms):
Oct 04 02:40:40 crc kubenswrapper[4964]: Trace[1105883047]: ---"Objects listed" error: 13006ms (02:40:40.502)
Oct 04 02:40:40 crc kubenswrapper[4964]: Trace[1105883047]: [13.006644451s] [13.006644451s] END
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.503120 4964 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.503778 4964 trace.go:236] Trace[2003597336]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Oct-2025 02:40:30.268) (total time: 10234ms):
Oct 04 02:40:40 crc kubenswrapper[4964]: Trace[2003597336]: ---"Objects listed" error: 10234ms (02:40:40.503)
Oct 04 02:40:40 crc kubenswrapper[4964]: Trace[2003597336]: [10.234942759s] [10.234942759s] END
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.503813 4964 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.586558 4964 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48822->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.586880 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48822->192.168.126.11:17697: read: connection reset by peer"
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.591517 4964 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.591554 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.770416 4964 apiserver.go:52] "Watching apiserver"
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.773272 4964 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.773563 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"]
Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.774012 4964 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.774095 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.774145 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.774251 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.774307 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.774562 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.774660 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.774774 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.774827 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.776562 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.776652 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.776696 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.776654 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.776840 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.776564 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.776933 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.776978 4964 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.778000 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.782701 4964 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.803769 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.803930 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.804329 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.805187 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.805652 4964 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.806290 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.804034 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.804291 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.805063 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.805593 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.806188 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.806509 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:40:41.306490815 +0000 UTC m=+21.203449453 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.807072 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.807416 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.807573 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.807676 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.808024 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.808486 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.808885 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.809182 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.809267 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.809689 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.810592 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.810683 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.810754 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.810819 4964 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.810884 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.810952 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.811031 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.811097 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.811861 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 
04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.812487 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.812821 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.813350 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.813729 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.814011 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.814279 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.814479 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.814723 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.815434 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.815863 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.816178 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: 
\"44663579-783b-4372-86d6-acf235a62d72\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.816489 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.816819 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.817211 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.817883 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.818205 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.818473 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.818768 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.818903 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.819265 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.819695 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.820214 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.820397 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.820491 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.820776 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.821480 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.822005 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.807979 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.808435 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.815337 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.808853 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.809442 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.810334 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.810352 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.810557 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.811421 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.811594 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.811606 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.811667 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.811712 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.811977 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.812031 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.812182 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.812788 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.812911 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.813315 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.813697 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.813978 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.814247 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.814684 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.815116 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.815401 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.815824 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.816140 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.816449 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.816775 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.817178 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.817847 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.818174 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.818438 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.818736 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.819193 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.819353 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.819659 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.820064 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.820716 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.821094 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.821489 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.821567 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.821728 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.822308 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.823249 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827320 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827426 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827476 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827509 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827532 4964 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827557 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827594 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827645 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827678 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827704 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827728 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827751 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827785 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827822 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827856 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827890 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827923 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827951 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.827994 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828065 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828107 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828139 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828171 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828205 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828251 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828289 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828333 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828376 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828413 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828451 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828522 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828558 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 04 02:40:40 crc 
kubenswrapper[4964]: I1004 02:40:40.828592 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828644 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828677 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828714 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828757 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828795 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828827 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828864 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828902 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828930 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.828965 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 04 02:40:40 crc 
kubenswrapper[4964]: I1004 02:40:40.829013 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.829043 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.829054 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.829081 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.829186 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.829193 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.829290 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.829596 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.829650 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.829680 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.829710 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.829285 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.829302 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.829964 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.830330 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.830403 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.830569 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.830663 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.831183 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.831226 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.831403 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.831537 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.831653 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.831920 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.832112 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.833210 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.833413 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.833474 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.833724 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.833832 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.834169 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.834193 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.834770 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.834918 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.835135 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.834894 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.834848 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.836497 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.835275 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.835302 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.835438 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.835507 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.836215 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.836226 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.836468 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.832238 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.836780 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.836820 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" 
(UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.836839 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.836851 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.836881 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.836947 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.837008 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.837301 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.837683 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838231 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838251 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838269 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838287 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838307 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838323 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838339 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838354 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838369 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838384 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838401 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838419 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838437 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838463 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838480 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838496 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838510 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838526 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838541 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838560 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838578 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838594 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838622 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838641 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838657 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838674 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838690 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838705 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838722 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838738 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.837405 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.837640 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.836708 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.837797 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.837600 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.837873 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838885 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838057 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838581 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838364 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838989 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.839528 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838760 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.839678 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.839708 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.839725 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.839800 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.839799 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.839801 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.839917 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.839961 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.839655 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840060 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840141 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.838468 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840211 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840033 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840461 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840470 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840548 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840585 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840694 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840749 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840799 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840843 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840877 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840911 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840950 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840986 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841026 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841062 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841101 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841136 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841175 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841209 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841246 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841282 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841318 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841355 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841387 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841427 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 04 02:40:40 crc 
kubenswrapper[4964]: I1004 02:40:40.841464 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841502 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841536 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841580 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841640 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841682 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841725 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841761 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841801 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841837 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841877 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 04 02:40:40 
crc kubenswrapper[4964]: I1004 02:40:40.841921 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841959 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841991 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842025 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842001 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842061 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842096 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842129 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842164 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842195 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842230 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842263 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842295 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842329 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842363 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842398 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842434 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842496 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842542 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842579 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.840942 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: 
"a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842652 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841073 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841511 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841565 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842694 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.841995 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842010 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842047 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842733 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842080 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842124 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842756 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842771 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842321 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842453 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842479 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842486 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.844939 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.845015 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.845156 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.845168 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.845236 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.842810 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.845440 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.845543 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.845672 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.845759 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.845839 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.845911 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.845982 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846047 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846174 4964 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846236 4964 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846292 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846348 4964 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846402 4964 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846454 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846512 4964 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846567 4964 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846699 4964 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846762 4964 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.845762 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: 
"96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.845961 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.845992 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846071 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846080 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846120 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846502 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846523 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846577 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846978 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.847236 4964 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.847321 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:41.34729862 +0000 UTC m=+21.244257368 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.847320 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.847826 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.847915 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.848027 4964 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.848207 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.848553 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.848884 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.849012 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.849485 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.846821 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.849774 4964 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.849817 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.849899 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:41.349836504 +0000 UTC m=+21.246795252 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850035 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850074 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850517 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.849778 4964 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850587 4964 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850606 4964 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850646 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850658 4964 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850670 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850679 4964 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850691 
4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850702 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850711 4964 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850722 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850731 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850743 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850753 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850749 4964 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850765 4964 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850821 4964 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850845 4964 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850865 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850883 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850907 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" 
DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850928 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850948 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850967 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.850986 4964 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851003 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851021 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851038 4964 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851055 4964 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851073 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851091 4964 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851118 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851135 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851152 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851169 4964 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851187 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851205 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851222 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851239 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851257 4964 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851274 4964 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851290 4964 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851307 4964 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc 
kubenswrapper[4964]: I1004 02:40:40.851325 4964 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851342 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851361 4964 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851378 4964 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851394 4964 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851411 4964 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851429 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851448 4964 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851468 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851487 4964 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851505 4964 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851523 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851541 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851560 4964 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851576 4964 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851596 4964 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851640 4964 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851659 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851676 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851693 4964 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851734 4964 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851751 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc 
kubenswrapper[4964]: I1004 02:40:40.851769 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851789 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851807 4964 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851826 4964 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851843 4964 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851862 4964 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851923 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: 
I1004 02:40:40.851940 4964 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851958 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851976 4964 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.851994 4964 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852011 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852028 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852047 4964 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852064 4964 reconciler_common.go:293] "Volume detached 
for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852082 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852183 4964 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852200 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852218 4964 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852235 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852252 4964 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852268 4964 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852286 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852304 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852321 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852338 4964 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852355 4964 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852374 4964 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852391 4964 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" 
DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852409 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852426 4964 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852445 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852462 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852481 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852499 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852515 4964 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: 
I1004 02:40:40.852533 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852551 4964 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852567 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852584 4964 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852602 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852833 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852853 4964 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852870 4964 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852891 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852911 4964 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852927 4964 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852945 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.852963 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.853014 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.853034 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.853052 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.853070 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.853088 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.853105 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.853123 4964 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.853140 4964 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.853158 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.853509 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.854850 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.855682 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.856835 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.857114 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.863885 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.867395 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.867987 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.870238 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.870284 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.870298 4964 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.870566 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.870693 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:41.370677288 +0000 UTC m=+21.267635926 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.870694 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.870735 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.870746 4964 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:40 crc kubenswrapper[4964]: E1004 02:40:40.870770 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-04 02:40:41.370763 +0000 UTC m=+21.267721638 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.870752 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.871908 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.872752 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.872889 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.873024 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.874545 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.878196 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.878642 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.879062 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.879059 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.874655 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod 
"7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.877892 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.878458 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.879004 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.879704 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.879853 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.880127 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.880728 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.884257 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.884548 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.885473 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.885770 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.885778 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.886241 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.886471 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.886944 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.887980 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.888039 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.889560 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.889768 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.890216 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.890611 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.892000 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.895007 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.895577 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.896268 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.898786 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.899946 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.901045 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.904759 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.917430 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.920006 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.923667 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.924446 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.925944 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.926552 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.926650 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.928467 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.929515 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.930472 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.931789 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.932835 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.933427 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.936480 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.936517 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.939274 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.939928 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.941056 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.941448 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.942063 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.943381 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.945108 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.945678 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" 
path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.945713 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.947728 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.948366 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.949401 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.950325 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.953908 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.953977 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954032 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954044 4964 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954055 4964 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954066 4964 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath 
\"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954074 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954084 4964 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954093 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954101 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954110 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954119 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954127 4964 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954136 4964 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954144 4964 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954153 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954162 4964 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954172 4964 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954180 4964 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954189 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954197 4964 reconciler_common.go:293] "Volume detached for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954207 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954216 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954226 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954234 4964 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954242 4964 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954251 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954259 4964 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954266 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954274 4964 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954282 4964 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954292 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954300 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954308 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954316 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" 
DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954335 4964 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954343 4964 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954352 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954360 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954370 4964 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954378 4964 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954385 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc 
kubenswrapper[4964]: I1004 02:40:40.954393 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954401 4964 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954410 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954418 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954426 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954435 4964 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954443 4964 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954451 4964 reconciler_common.go:293] "Volume detached for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954458 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954467 4964 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954476 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954485 4964 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954496 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954505 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954512 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node 
\"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954520 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954527 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954539 4964 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954547 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954555 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954564 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.954634 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 
02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.955049 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.957051 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.957543 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.960667 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.961782 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.964056 4964 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.964163 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 
02:40:40.966430 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.967295 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.976145 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.978315 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.992221 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 04 02:40:40 crc kubenswrapper[4964]: I1004 02:40:40.999387 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.000419 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.002231 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.004141 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.005660 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.024316 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.024588 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.025131 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.025430 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.026507 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.026807 4964 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106" exitCode=255 Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.026974 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.027481 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.028321 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.032740 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.033697 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.034229 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.035305 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.036130 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.036480 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.037241 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.037768 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.038268 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106"} Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.046003 4964 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.054633 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.055031 4964 scope.go:117] "RemoveContainer" containerID="1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.055557 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.083910 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.091609 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.098165 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.103172 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.108287 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.124315 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.138235 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.169466 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.188070 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.199939 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.362229 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.362553 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.362578 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:41 crc kubenswrapper[4964]: E1004 02:40:41.362682 4964 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:40:41 crc kubenswrapper[4964]: E1004 02:40:41.362729 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:42.36271598 +0000 UTC m=+22.259674618 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:40:41 crc kubenswrapper[4964]: E1004 02:40:41.363013 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:40:42.363006358 +0000 UTC m=+22.259964996 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:40:41 crc kubenswrapper[4964]: E1004 02:40:41.363044 4964 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:40:41 crc kubenswrapper[4964]: E1004 02:40:41.363063 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:42.36305793 +0000 UTC m=+22.260016568 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.463118 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.463212 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:41 crc kubenswrapper[4964]: E1004 02:40:41.463276 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:40:41 crc kubenswrapper[4964]: E1004 02:40:41.463310 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 02:40:41 crc kubenswrapper[4964]: E1004 02:40:41.463322 4964 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:41 crc kubenswrapper[4964]: E1004 02:40:41.463341 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:40:41 crc kubenswrapper[4964]: E1004 02:40:41.463360 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 02:40:41 crc kubenswrapper[4964]: E1004 02:40:41.463370 4964 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:41 crc kubenswrapper[4964]: E1004 02:40:41.463380 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:42.463362979 +0000 UTC m=+22.360321617 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:41 crc kubenswrapper[4964]: E1004 02:40:41.463406 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-04 02:40:42.46338771 +0000 UTC m=+22.360346348 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.844477 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:41 crc kubenswrapper[4964]: E1004 02:40:41.844639 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.991412 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-w556r"] Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.991908 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-w556r" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.994601 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.996413 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 04 02:40:41 crc kubenswrapper[4964]: I1004 02:40:41.996714 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.019157 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.029781 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6"} Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.029825 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5"} Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.029836 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cb276f3fa3defa12c75d035c23b2d91d3bc0d4cb7da71d1b55d5347713344ac2"} Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.030877 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620"} Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.030941 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0be2ab009a3b10e9cdd59dc67963b7aba5ece663940fbd9f055fe462893d9357"} Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.032286 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.033763 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789"} Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.033996 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.034661 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"047f76240db2d2c14b1b35d20a6e2c1d8afecd59eca1388c8821c7eb5ab4b9b1"} Oct 04 02:40:42 crc kubenswrapper[4964]: 
I1004 02:40:42.044110 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.064964 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.068300 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf670ba7-2bcb-4d80-b655-289c47e35cf1-hosts-file\") pod \"node-resolver-w556r\" (UID: \"cf670ba7-2bcb-4d80-b655-289c47e35cf1\") " pod="openshift-dns/node-resolver-w556r" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.068355 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dvpk\" (UniqueName: \"kubernetes.io/projected/cf670ba7-2bcb-4d80-b655-289c47e35cf1-kube-api-access-4dvpk\") pod \"node-resolver-w556r\" (UID: \"cf670ba7-2bcb-4d80-b655-289c47e35cf1\") " pod="openshift-dns/node-resolver-w556r" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.086815 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.105714 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.116105 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.128078 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.141862 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.157199 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.170039 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dvpk\" (UniqueName: \"kubernetes.io/projected/cf670ba7-2bcb-4d80-b655-289c47e35cf1-kube-api-access-4dvpk\") pod \"node-resolver-w556r\" (UID: \"cf670ba7-2bcb-4d80-b655-289c47e35cf1\") " pod="openshift-dns/node-resolver-w556r" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.170667 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf670ba7-2bcb-4d80-b655-289c47e35cf1-hosts-file\") pod \"node-resolver-w556r\" (UID: \"cf670ba7-2bcb-4d80-b655-289c47e35cf1\") " pod="openshift-dns/node-resolver-w556r" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.170600 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf670ba7-2bcb-4d80-b655-289c47e35cf1-hosts-file\") pod \"node-resolver-w556r\" (UID: \"cf670ba7-2bcb-4d80-b655-289c47e35cf1\") " pod="openshift-dns/node-resolver-w556r" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.171466 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.182410 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.193889 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dvpk\" (UniqueName: \"kubernetes.io/projected/cf670ba7-2bcb-4d80-b655-289c47e35cf1-kube-api-access-4dvpk\") pod \"node-resolver-w556r\" (UID: \"cf670ba7-2bcb-4d80-b655-289c47e35cf1\") " 
pod="openshift-dns/node-resolver-w556r" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.194914 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.205832 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.217815 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.227878 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.243032 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.307580 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-w556r" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.372430 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.372514 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.372561 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.372670 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:40:44.372643666 +0000 UTC m=+24.269602304 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.372712 4964 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.372773 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:44.372757489 +0000 UTC m=+24.269716137 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.372772 4964 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.372881 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-04 02:40:44.372854832 +0000 UTC m=+24.269813510 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:40:42 crc kubenswrapper[4964]: W1004 02:40:42.386740 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf670ba7_2bcb_4d80_b655_289c47e35cf1.slice/crio-25d2e6f46526b359732881d6e3368c787e30f74d6ecd2d2163259d80712ba9dd WatchSource:0}: Error finding container 25d2e6f46526b359732881d6e3368c787e30f74d6ecd2d2163259d80712ba9dd: Status 404 returned error can't find the container with id 25d2e6f46526b359732881d6e3368c787e30f74d6ecd2d2163259d80712ba9dd Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.395157 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-q6hm8"] Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.395516 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-prcqh"] Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.395663 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.396500 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-m7mv7"] Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.396749 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.397003 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:40:42 crc kubenswrapper[4964]: W1004 02:40:42.399539 4964 reflector.go:561] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": failed to list *v1.Secret: secrets "machine-config-daemon-dockercfg-r5tcq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.399586 4964 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-r5tcq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-config-daemon-dockercfg-r5tcq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 02:40:42 crc kubenswrapper[4964]: W1004 02:40:42.399829 4964 reflector.go:561] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": failed to list *v1.Secret: secrets "multus-ancillary-tools-dockercfg-vnmsz" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.399866 4964 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vnmsz\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"multus-ancillary-tools-dockercfg-vnmsz\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 02:40:42 crc 
kubenswrapper[4964]: I1004 02:40:42.399924 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.399933 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.400005 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.399936 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.400110 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.400495 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.400712 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 04 02:40:42 crc kubenswrapper[4964]: W1004 02:40:42.400846 4964 reflector.go:561] object-"openshift-multus"/"default-cni-sysctl-allowlist": failed to list *v1.ConfigMap: configmaps "default-cni-sysctl-allowlist" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.400890 4964 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"default-cni-sysctl-allowlist\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group 
\"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.401016 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.402011 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xrs78"] Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.403054 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.403823 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.407387 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.407449 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.407932 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.408279 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.408405 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.408522 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.408709 4964 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.422143 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.441598 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"na
me\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.465149 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.473607 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-system-cni-dir\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.473684 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.473712 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/95c02c3c-a484-46f9-a96d-8650b8f9c67f-proxy-tls\") pod \"machine-config-daemon-m7mv7\" (UID: \"95c02c3c-a484-46f9-a96d-8650b8f9c67f\") " pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.473737 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/10ea848d-0322-476d-976d-4ae3ac39910b-multus-daemon-config\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.473762 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-var-lib-openvswitch\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.473788 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-cni-bin\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.473826 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.473851 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-openvswitch\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.473879 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.473902 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95c02c3c-a484-46f9-a96d-8650b8f9c67f-mcd-auth-proxy-config\") pod \"machine-config-daemon-m7mv7\" (UID: \"95c02c3c-a484-46f9-a96d-8650b8f9c67f\") " pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.473927 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-etc-kubernetes\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.473952 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-run-netns\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.473974 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-run-netns\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.473998 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-ovn\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474023 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-cni-netd\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474062 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqkx7\" (UniqueName: \"kubernetes.io/projected/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-kube-api-access-mqkx7\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.474086 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.474140 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 
02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.474166 4964 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474095 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-multus-cni-dir\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.474251 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:44.474221958 +0000 UTC m=+24.371180636 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474307 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-os-release\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474348 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-run-k8s-cni-cncf-io\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474374 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-var-lib-cni-bin\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474410 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e36b0a0d-d6be-4917-a161-26245a74904a-cnibin\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 
02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474434 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-etc-openvswitch\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474490 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-var-lib-cni-multus\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474511 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-env-overrides\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474583 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovn-node-metrics-cert\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474697 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsxpm\" (UniqueName: \"kubernetes.io/projected/10ea848d-0322-476d-976d-4ae3ac39910b-kube-api-access-fsxpm\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " 
pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474734 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e36b0a0d-d6be-4917-a161-26245a74904a-os-release\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474769 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-var-lib-kubelet\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474839 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-run-multus-certs\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474895 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e36b0a0d-d6be-4917-a161-26245a74904a-system-cni-dir\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474933 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-systemd\") pod \"ovnkube-node-xrs78\" (UID: 
\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474957 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-node-log\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474979 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znwzc\" (UniqueName: \"kubernetes.io/projected/95c02c3c-a484-46f9-a96d-8650b8f9c67f-kube-api-access-znwzc\") pod \"machine-config-daemon-m7mv7\" (UID: \"95c02c3c-a484-46f9-a96d-8650b8f9c67f\") " pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.474999 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-multus-conf-dir\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475022 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-systemd-units\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475093 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475144 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-cnibin\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475249 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10ea848d-0322-476d-976d-4ae3ac39910b-cni-binary-copy\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.475254 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475277 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e36b0a0d-d6be-4917-a161-26245a74904a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.475287 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.475309 4964 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475313 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-kubelet\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475356 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-log-socket\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.475435 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:44.475417139 +0000 UTC m=+24.372375787 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475531 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-multus-socket-dir-parent\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475575 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-hostroot\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475608 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e36b0a0d-d6be-4917-a161-26245a74904a-cni-binary-copy\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475703 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nn56\" (UniqueName: \"kubernetes.io/projected/e36b0a0d-d6be-4917-a161-26245a74904a-kube-api-access-5nn56\") pod \"multus-additional-cni-plugins-prcqh\" (UID: 
\"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475757 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-slash\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475828 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovnkube-config\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475860 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovnkube-script-lib\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475891 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/95c02c3c-a484-46f9-a96d-8650b8f9c67f-rootfs\") pod \"machine-config-daemon-m7mv7\" (UID: \"95c02c3c-a484-46f9-a96d-8650b8f9c67f\") " pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.475938 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/e36b0a0d-d6be-4917-a161-26245a74904a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.485325 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.503985 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.547698 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.562732 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576512 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-run-netns\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576553 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-run-netns\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576567 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-ovn\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576587 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-cni-netd\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576602 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqkx7\" (UniqueName: \"kubernetes.io/projected/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-kube-api-access-mqkx7\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576641 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-multus-cni-dir\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576660 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-os-release\") pod \"multus-q6hm8\" (UID: 
\"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576674 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-run-k8s-cni-cncf-io\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576692 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-var-lib-cni-bin\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576701 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-cni-netd\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576737 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-ovn\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576768 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-run-netns\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 
02:40:42.576707 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e36b0a0d-d6be-4917-a161-26245a74904a-cnibin\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576750 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e36b0a0d-d6be-4917-a161-26245a74904a-cnibin\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576806 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-etc-openvswitch\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576838 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-multus-cni-dir\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576850 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-run-k8s-cni-cncf-io\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576871 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-etc-openvswitch\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576842 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-var-lib-cni-multus\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576887 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-var-lib-cni-bin\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576899 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-env-overrides\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576666 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-run-netns\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576917 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovn-node-metrics-cert\") 
pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576936 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsxpm\" (UniqueName: \"kubernetes.io/projected/10ea848d-0322-476d-976d-4ae3ac39910b-kube-api-access-fsxpm\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576954 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e36b0a0d-d6be-4917-a161-26245a74904a-os-release\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576970 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-var-lib-kubelet\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576984 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-run-multus-certs\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576991 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-os-release\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " 
pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577001 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e36b0a0d-d6be-4917-a161-26245a74904a-system-cni-dir\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577019 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-systemd\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577035 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-node-log\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577046 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e36b0a0d-d6be-4917-a161-26245a74904a-os-release\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577052 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znwzc\" (UniqueName: \"kubernetes.io/projected/95c02c3c-a484-46f9-a96d-8650b8f9c67f-kube-api-access-znwzc\") pod \"machine-config-daemon-m7mv7\" (UID: \"95c02c3c-a484-46f9-a96d-8650b8f9c67f\") " 
pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.576857 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-var-lib-cni-multus\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577073 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-multus-conf-dir\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577098 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e36b0a0d-d6be-4917-a161-26245a74904a-system-cni-dir\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577106 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-systemd-units\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577123 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-node-log\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: 
I1004 02:40:42.577145 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-cnibin\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577151 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-systemd-units\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577160 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10ea848d-0322-476d-976d-4ae3ac39910b-cni-binary-copy\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577184 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e36b0a0d-d6be-4917-a161-26245a74904a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577201 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-kubelet\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577217 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-log-socket\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577241 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-multus-socket-dir-parent\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577258 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-hostroot\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577274 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e36b0a0d-d6be-4917-a161-26245a74904a-cni-binary-copy\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577290 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nn56\" (UniqueName: \"kubernetes.io/projected/e36b0a0d-d6be-4917-a161-26245a74904a-kube-api-access-5nn56\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577306 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-slash\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577324 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovnkube-config\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577346 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovnkube-script-lib\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577371 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/95c02c3c-a484-46f9-a96d-8650b8f9c67f-rootfs\") pod \"machine-config-daemon-m7mv7\" (UID: \"95c02c3c-a484-46f9-a96d-8650b8f9c67f\") " pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577398 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e36b0a0d-d6be-4917-a161-26245a74904a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577422 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-system-cni-dir\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577441 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577472 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95c02c3c-a484-46f9-a96d-8650b8f9c67f-proxy-tls\") pod \"machine-config-daemon-m7mv7\" (UID: \"95c02c3c-a484-46f9-a96d-8650b8f9c67f\") " pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577498 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/10ea848d-0322-476d-976d-4ae3ac39910b-multus-daemon-config\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577520 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-var-lib-openvswitch\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577538 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-cni-bin\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577572 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-openvswitch\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577593 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577648 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-env-overrides\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577658 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95c02c3c-a484-46f9-a96d-8650b8f9c67f-mcd-auth-proxy-config\") pod \"machine-config-daemon-m7mv7\" (UID: \"95c02c3c-a484-46f9-a96d-8650b8f9c67f\") " pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577018 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-var-lib-kubelet\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577695 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-etc-kubernetes\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577712 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-systemd\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577718 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10ea848d-0322-476d-976d-4ae3ac39910b-cni-binary-copy\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577754 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-etc-kubernetes\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577760 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-cnibin\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 
crc kubenswrapper[4964]: I1004 02:40:42.577761 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.578013 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/95c02c3c-a484-46f9-a96d-8650b8f9c67f-rootfs\") pod \"machine-config-daemon-m7mv7\" (UID: \"95c02c3c-a484-46f9-a96d-8650b8f9c67f\") " pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.578111 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-system-cni-dir\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.578138 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-cni-bin\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.578384 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e36b0a0d-d6be-4917-a161-26245a74904a-cni-binary-copy\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.578481 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovnkube-script-lib\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.578534 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-slash\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.578684 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/10ea848d-0322-476d-976d-4ae3ac39910b-multus-daemon-config\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.578732 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-var-lib-openvswitch\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.578751 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-multus-conf-dir\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.578774 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.577083 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-host-run-multus-certs\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.578802 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-openvswitch\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.578826 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-log-socket\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.578847 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-kubelet\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.578878 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-multus-socket-dir-parent\") pod \"multus-q6hm8\" 
(UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.578903 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/10ea848d-0322-476d-976d-4ae3ac39910b-hostroot\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.579101 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovnkube-config\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.579373 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95c02c3c-a484-46f9-a96d-8650b8f9c67f-mcd-auth-proxy-config\") pod \"machine-config-daemon-m7mv7\" (UID: \"95c02c3c-a484-46f9-a96d-8650b8f9c67f\") " pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.579452 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e36b0a0d-d6be-4917-a161-26245a74904a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.583118 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95c02c3c-a484-46f9-a96d-8650b8f9c67f-proxy-tls\") pod \"machine-config-daemon-m7mv7\" (UID: \"95c02c3c-a484-46f9-a96d-8650b8f9c67f\") " 
pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.585058 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovn-node-metrics-cert\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.592235 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.604375 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nn56\" (UniqueName: \"kubernetes.io/projected/e36b0a0d-d6be-4917-a161-26245a74904a-kube-api-access-5nn56\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.607768 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqkx7\" (UniqueName: \"kubernetes.io/projected/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-kube-api-access-mqkx7\") pod \"ovnkube-node-xrs78\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.610265 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsxpm\" (UniqueName: \"kubernetes.io/projected/10ea848d-0322-476d-976d-4ae3ac39910b-kube-api-access-fsxpm\") pod \"multus-q6hm8\" (UID: \"10ea848d-0322-476d-976d-4ae3ac39910b\") " pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.611040 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znwzc\" (UniqueName: \"kubernetes.io/projected/95c02c3c-a484-46f9-a96d-8650b8f9c67f-kube-api-access-znwzc\") pod \"machine-config-daemon-m7mv7\" (UID: \"95c02c3c-a484-46f9-a96d-8650b8f9c67f\") " pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.622673 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.636241 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.653723 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.670989 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.684539 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 
02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.694683 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.708108 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.722818 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.743580 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.757390 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.773805 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.784959 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.802225 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:42Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.809747 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q6hm8" Oct 04 02:40:42 crc kubenswrapper[4964]: W1004 02:40:42.822077 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10ea848d_0322_476d_976d_4ae3ac39910b.slice/crio-f485145fd1141475f4d19833d68b078dcfcefc32a850aa5325b8deaefe9d7971 WatchSource:0}: Error finding container f485145fd1141475f4d19833d68b078dcfcefc32a850aa5325b8deaefe9d7971: Status 404 returned error can't find the container with id f485145fd1141475f4d19833d68b078dcfcefc32a850aa5325b8deaefe9d7971 Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.827711 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:42 crc kubenswrapper[4964]: W1004 02:40:42.841638 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74942bdc_b3cd_4b92_8b6e_0daf7c89e4e9.slice/crio-9a613c2410497bf33723cd282e13fa4d849033737eca1cf90b1df50904a015b5 WatchSource:0}: Error finding container 9a613c2410497bf33723cd282e13fa4d849033737eca1cf90b1df50904a015b5: Status 404 returned error can't find the container with id 9a613c2410497bf33723cd282e13fa4d849033737eca1cf90b1df50904a015b5 Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.846169 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.846258 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:40:42 crc kubenswrapper[4964]: I1004 02:40:42.846300 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:42 crc kubenswrapper[4964]: E1004 02:40:42.846339 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.038186 4964 generic.go:334] "Generic (PLEG): container finished" podID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerID="ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d" exitCode=0 Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.038258 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerDied","Data":"ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d"} Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.038313 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerStarted","Data":"9a613c2410497bf33723cd282e13fa4d849033737eca1cf90b1df50904a015b5"} Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.039727 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q6hm8" event={"ID":"10ea848d-0322-476d-976d-4ae3ac39910b","Type":"ContainerStarted","Data":"b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709"} Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.039752 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q6hm8" event={"ID":"10ea848d-0322-476d-976d-4ae3ac39910b","Type":"ContainerStarted","Data":"f485145fd1141475f4d19833d68b078dcfcefc32a850aa5325b8deaefe9d7971"} Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.040781 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w556r" event={"ID":"cf670ba7-2bcb-4d80-b655-289c47e35cf1","Type":"ContainerStarted","Data":"a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2"} Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.040813 4964 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w556r" event={"ID":"cf670ba7-2bcb-4d80-b655-289c47e35cf1","Type":"ContainerStarted","Data":"25d2e6f46526b359732881d6e3368c787e30f74d6ecd2d2163259d80712ba9dd"} Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.063997 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.077215 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.092224 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.106890 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.118782 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.129422 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.141984 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.161575 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.171695 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.186438 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.198263 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.208058 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.221394 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.233667 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.251088 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.270778 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.287997 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.301994 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.317976 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.333531 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env
\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.352512 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.368494 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.389526 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.405466 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:43Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:43 crc kubenswrapper[4964]: E1004 02:40:43.578661 4964 configmap.go:193] Couldn't get configMap openshift-multus/default-cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Oct 04 02:40:43 crc kubenswrapper[4964]: E1004 02:40:43.579035 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e36b0a0d-d6be-4917-a161-26245a74904a-cni-sysctl-allowlist podName:e36b0a0d-d6be-4917-a161-26245a74904a nodeName:}" failed. No retries permitted until 2025-10-04 02:40:44.079017368 +0000 UTC m=+23.975976006 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/e36b0a0d-d6be-4917-a161-26245a74904a-cni-sysctl-allowlist") pod "multus-additional-cni-plugins-prcqh" (UID: "e36b0a0d-d6be-4917-a161-26245a74904a") : failed to sync configmap cache: timed out waiting for the condition Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.656154 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.790776 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.824130 4964 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" secret="" err="failed to sync secret cache: timed out waiting for the condition" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.824440 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:40:43 crc kubenswrapper[4964]: W1004 02:40:43.841474 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95c02c3c_a484_46f9_a96d_8650b8f9c67f.slice/crio-9070c1082de143fc015775cf4dce11076f5d8e2fe82c3389da4f7ee21670b611 WatchSource:0}: Error finding container 9070c1082de143fc015775cf4dce11076f5d8e2fe82c3389da4f7ee21670b611: Status 404 returned error can't find the container with id 9070c1082de143fc015775cf4dce11076f5d8e2fe82c3389da4f7ee21670b611 Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.844484 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:43 crc kubenswrapper[4964]: E1004 02:40:43.844738 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:40:43 crc kubenswrapper[4964]: I1004 02:40:43.870449 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.044674 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870"} Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.045946 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387"} Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.045988 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"9070c1082de143fc015775cf4dce11076f5d8e2fe82c3389da4f7ee21670b611"} Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.050164 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" 
event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerStarted","Data":"45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032"} Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.050196 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerStarted","Data":"d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16"} Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.050207 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerStarted","Data":"00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54"} Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.050216 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerStarted","Data":"4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91"} Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.050226 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerStarted","Data":"16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede"} Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.050235 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerStarted","Data":"1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f"} Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.059505 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.086784 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.092946 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e36b0a0d-d6be-4917-a161-26245a74904a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.093462 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e36b0a0d-d6be-4917-a161-26245a74904a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-prcqh\" (UID: \"e36b0a0d-d6be-4917-a161-26245a74904a\") " pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.105096 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 
02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.117558 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.128846 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.145254 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.161589 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.180751 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.201969 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.215981 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.232367 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.247093 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.317643 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-prcqh" Oct 04 02:40:44 crc kubenswrapper[4964]: W1004 02:40:44.332456 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode36b0a0d_d6be_4917_a161_26245a74904a.slice/crio-8da7287c976828a84995673dec1b2b5a3923835c73095183900d2987187cbe68 WatchSource:0}: Error finding container 8da7287c976828a84995673dec1b2b5a3923835c73095183900d2987187cbe68: Status 404 returned error can't find the container with id 8da7287c976828a84995673dec1b2b5a3923835c73095183900d2987187cbe68 Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.396327 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:40:44 crc kubenswrapper[4964]: E1004 02:40:44.396425 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:40:48.396402165 +0000 UTC m=+28.293360813 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.396474 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.396529 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:44 crc kubenswrapper[4964]: E1004 02:40:44.396592 4964 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:40:44 crc kubenswrapper[4964]: E1004 02:40:44.396647 4964 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:40:44 crc kubenswrapper[4964]: E1004 02:40:44.396697 4964 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:48.396687122 +0000 UTC m=+28.293645770 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:40:44 crc kubenswrapper[4964]: E1004 02:40:44.396715 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:48.396707773 +0000 UTC m=+28.293666411 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.497104 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.497178 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:44 crc kubenswrapper[4964]: E1004 02:40:44.497323 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:40:44 crc kubenswrapper[4964]: E1004 02:40:44.497346 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 02:40:44 crc kubenswrapper[4964]: E1004 02:40:44.497360 4964 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:44 crc kubenswrapper[4964]: E1004 02:40:44.497411 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:48.497393533 +0000 UTC m=+28.394352171 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:44 crc kubenswrapper[4964]: E1004 02:40:44.497572 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:40:44 crc kubenswrapper[4964]: E1004 02:40:44.497711 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 02:40:44 crc kubenswrapper[4964]: E1004 02:40:44.497839 4964 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:44 crc kubenswrapper[4964]: E1004 02:40:44.497882 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-04 02:40:48.497871576 +0000 UTC m=+28.394830214 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.509954 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.513908 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.520712 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.527597 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.540858 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.553220 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.568288 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.588105 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.622008 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.648412 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.674641 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.693257 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.711118 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.727720 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.740677 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.751112 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.765683 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.775300 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.787929 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.806471 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.822663 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.835928 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.844674 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:44 crc kubenswrapper[4964]: E1004 02:40:44.844820 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.844687 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:44 crc kubenswrapper[4964]: E1004 02:40:44.845067 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.852894 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.867967 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.879878 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.895458 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.911648 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:44 crc kubenswrapper[4964]: I1004 02:40:44.925225 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:44Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.028154 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bnp9l"] Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.028895 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bnp9l" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.030403 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.030627 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.030903 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.031170 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.046202 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.057496 4964 generic.go:334] "Generic (PLEG): container finished" podID="e36b0a0d-d6be-4917-a161-26245a74904a" containerID="eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45" exitCode=0 Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.057577 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" event={"ID":"e36b0a0d-d6be-4917-a161-26245a74904a","Type":"ContainerDied","Data":"eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45"} Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.057606 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" event={"ID":"e36b0a0d-d6be-4917-a161-26245a74904a","Type":"ContainerStarted","Data":"8da7287c976828a84995673dec1b2b5a3923835c73095183900d2987187cbe68"} Oct 04 
02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.059889 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd"} Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.063116 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.076765 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.103205 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.104032 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/12ce3232-4729-4910-9890-a3da4586342c-serviceca\") pod \"node-ca-bnp9l\" (UID: \"12ce3232-4729-4910-9890-a3da4586342c\") " pod="openshift-image-registry/node-ca-bnp9l" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.104086 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12ce3232-4729-4910-9890-a3da4586342c-host\") pod \"node-ca-bnp9l\" (UID: \"12ce3232-4729-4910-9890-a3da4586342c\") " pod="openshift-image-registry/node-ca-bnp9l" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.104177 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5bjx\" (UniqueName: \"kubernetes.io/projected/12ce3232-4729-4910-9890-a3da4586342c-kube-api-access-q5bjx\") pod \"node-ca-bnp9l\" (UID: \"12ce3232-4729-4910-9890-a3da4586342c\") " pod="openshift-image-registry/node-ca-bnp9l" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.115597 4964 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.141768 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.155806 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.166568 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.178836 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.194441 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.205286 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/12ce3232-4729-4910-9890-a3da4586342c-serviceca\") pod \"node-ca-bnp9l\" (UID: \"12ce3232-4729-4910-9890-a3da4586342c\") " pod="openshift-image-registry/node-ca-bnp9l" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.205332 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12ce3232-4729-4910-9890-a3da4586342c-host\") pod \"node-ca-bnp9l\" (UID: \"12ce3232-4729-4910-9890-a3da4586342c\") " pod="openshift-image-registry/node-ca-bnp9l" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 
02:40:45.205465 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5bjx\" (UniqueName: \"kubernetes.io/projected/12ce3232-4729-4910-9890-a3da4586342c-kube-api-access-q5bjx\") pod \"node-ca-bnp9l\" (UID: \"12ce3232-4729-4910-9890-a3da4586342c\") " pod="openshift-image-registry/node-ca-bnp9l" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.206203 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12ce3232-4729-4910-9890-a3da4586342c-host\") pod \"node-ca-bnp9l\" (UID: \"12ce3232-4729-4910-9890-a3da4586342c\") " pod="openshift-image-registry/node-ca-bnp9l" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.207687 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/12ce3232-4729-4910-9890-a3da4586342c-serviceca\") pod \"node-ca-bnp9l\" (UID: \"12ce3232-4729-4910-9890-a3da4586342c\") " pod="openshift-image-registry/node-ca-bnp9l" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.213877 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.224952 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5bjx\" (UniqueName: \"kubernetes.io/projected/12ce3232-4729-4910-9890-a3da4586342c-kube-api-access-q5bjx\") pod \"node-ca-bnp9l\" (UID: \"12ce3232-4729-4910-9890-a3da4586342c\") " pod="openshift-image-registry/node-ca-bnp9l" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.233376 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.247781 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.260064 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.287969 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.301736 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.323441 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.363416 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.407221 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.423920 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bnp9l" Oct 04 02:40:45 crc kubenswrapper[4964]: W1004 02:40:45.439304 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12ce3232_4729_4910_9890_a3da4586342c.slice/crio-5d3093cf48444687696e08cbe18e3dcd7563e7552d361dd364b79a786371cf9d WatchSource:0}: Error finding container 5d3093cf48444687696e08cbe18e3dcd7563e7552d361dd364b79a786371cf9d: Status 404 returned error can't find the container with id 5d3093cf48444687696e08cbe18e3dcd7563e7552d361dd364b79a786371cf9d Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.453989 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.484969 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c
4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.520230 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.565460 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 
02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.601885 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.642081 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.681552 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.729555 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.764436 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:45Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:45 crc kubenswrapper[4964]: I1004 02:40:45.845096 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:45 crc kubenswrapper[4964]: E1004 02:40:45.845269 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.065325 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bnp9l" event={"ID":"12ce3232-4729-4910-9890-a3da4586342c","Type":"ContainerStarted","Data":"03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420"} Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.065394 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bnp9l" event={"ID":"12ce3232-4729-4910-9890-a3da4586342c","Type":"ContainerStarted","Data":"5d3093cf48444687696e08cbe18e3dcd7563e7552d361dd364b79a786371cf9d"} Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.073543 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerStarted","Data":"fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f"} Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.078213 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" event={"ID":"e36b0a0d-d6be-4917-a161-26245a74904a","Type":"ContainerStarted","Data":"e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06"} Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.090058 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.104777 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.118182 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.138068 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.153609 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.169641 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.186800 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.198789 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.215053 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04
T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.231403 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\
"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.248943 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f
0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"19
2.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.271282 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.294872 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.327587 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.401755 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 
02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.431926 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.443579 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.482373 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.529168 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.565247 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.603993 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.648631 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.684374 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.725514 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.764249 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.805402 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.848847 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.848923 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:46 crc kubenswrapper[4964]: E1004 02:40:46.849051 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:40:46 crc kubenswrapper[4964]: E1004 02:40:46.849681 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.850992 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":
\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.883744 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.901733 4964 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.904807 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.904835 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.904846 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.904940 4964 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.915686 4964 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.915873 4964 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.917478 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.917522 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:46 crc 
kubenswrapper[4964]: I1004 02:40:46.917531 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.917544 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.917554 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:46Z","lastTransitionTime":"2025-10-04T02:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:46 crc kubenswrapper[4964]: E1004 02:40:46.944365 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.949320 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.949358 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.949375 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.949400 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.949418 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:46Z","lastTransitionTime":"2025-10-04T02:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:46 crc kubenswrapper[4964]: E1004 02:40:46.963728 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.968453 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.968499 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.968511 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.968528 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.968539 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:46Z","lastTransitionTime":"2025-10-04T02:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:46 crc kubenswrapper[4964]: E1004 02:40:46.986829 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:46Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.991571 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.991604 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.991630 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.991646 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:46 crc kubenswrapper[4964]: I1004 02:40:46.991655 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:46Z","lastTransitionTime":"2025-10-04T02:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:47 crc kubenswrapper[4964]: E1004 02:40:47.004627 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.008567 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.008830 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.008990 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.009164 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.009301 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:47Z","lastTransitionTime":"2025-10-04T02:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:47 crc kubenswrapper[4964]: E1004 02:40:47.026006 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:47 crc kubenswrapper[4964]: E1004 02:40:47.026193 4964 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.028470 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.028495 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.028504 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.028517 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.028527 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:47Z","lastTransitionTime":"2025-10-04T02:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.082791 4964 generic.go:334] "Generic (PLEG): container finished" podID="e36b0a0d-d6be-4917-a161-26245a74904a" containerID="e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06" exitCode=0 Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.082846 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" event={"ID":"e36b0a0d-d6be-4917-a161-26245a74904a","Type":"ContainerDied","Data":"e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06"} Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.101129 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.118924 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.134710 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.134775 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.134794 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.134818 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.134835 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:47Z","lastTransitionTime":"2025-10-04T02:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.140372 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:
40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.165238 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.187747 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.208067 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 
02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.221845 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.238383 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.238421 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.238435 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.238455 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.238471 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:47Z","lastTransitionTime":"2025-10-04T02:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.243067 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.287424 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.325517 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.340844 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.340884 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.340895 4964 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.340915 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.340927 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:47Z","lastTransitionTime":"2025-10-04T02:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.363232 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07
372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.402399 4964 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.441834 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.442793 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.442823 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.442832 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.442847 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.442856 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:47Z","lastTransitionTime":"2025-10-04T02:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.487599 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:47Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.545502 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.545536 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.545545 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.545559 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.545580 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:47Z","lastTransitionTime":"2025-10-04T02:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.648483 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.648554 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.648579 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.648611 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.648679 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:47Z","lastTransitionTime":"2025-10-04T02:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.751893 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.751971 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.751989 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.752017 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.752035 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:47Z","lastTransitionTime":"2025-10-04T02:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.844278 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:47 crc kubenswrapper[4964]: E1004 02:40:47.844459 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.854722 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.854799 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.854817 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.854849 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.854870 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:47Z","lastTransitionTime":"2025-10-04T02:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.958525 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.958586 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.958605 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.958656 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:47 crc kubenswrapper[4964]: I1004 02:40:47.958677 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:47Z","lastTransitionTime":"2025-10-04T02:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.062233 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.062789 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.062812 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.062836 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.062855 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:48Z","lastTransitionTime":"2025-10-04T02:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.094252 4964 generic.go:334] "Generic (PLEG): container finished" podID="e36b0a0d-d6be-4917-a161-26245a74904a" containerID="74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8" exitCode=0 Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.094397 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" event={"ID":"e36b0a0d-d6be-4917-a161-26245a74904a","Type":"ContainerDied","Data":"74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8"} Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.104712 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerStarted","Data":"cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16"} Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.106258 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.106357 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.118947 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 
02:40:48.139097 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.156608 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.158257 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.159004 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.167756 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.167803 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.167821 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.167844 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.167864 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:48Z","lastTransitionTime":"2025-10-04T02:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.173120 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.189278 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.203992 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.222196 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c
4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.246635 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.268272 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.269264 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.269288 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.269297 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.269313 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.269322 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:48Z","lastTransitionTime":"2025-10-04T02:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.280729 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.295003 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 
02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.307028 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.320596 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb41
3bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.334012 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.348969 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.362335 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.372111 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.372143 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.372157 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.372176 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.372188 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:48Z","lastTransitionTime":"2025-10-04T02:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.379916 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.395914 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.412777 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.431782 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.449822 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.450095 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.450275 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:48 crc kubenswrapper[4964]: E1004 02:40:48.450314 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:40:56.450285814 +0000 UTC m=+36.347244482 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.450406 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:48 crc kubenswrapper[4964]: E1004 02:40:48.450469 4964 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:40:48 crc kubenswrapper[4964]: E1004 02:40:48.450551 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:56.450526961 +0000 UTC m=+36.347485799 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:40:48 crc kubenswrapper[4964]: E1004 02:40:48.450593 4964 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:40:48 crc kubenswrapper[4964]: E1004 02:40:48.450710 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:56.450687935 +0000 UTC m=+36.347646583 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.465841 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c
4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.475451 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.475509 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.475522 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:48 crc 
kubenswrapper[4964]: I1004 02:40:48.475543 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.475553 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:48Z","lastTransitionTime":"2025-10-04T02:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.485342 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 
02:40:48.503107 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.520784 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b2
6\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.542465 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.548535 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.550891 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.550941 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:48 crc kubenswrapper[4964]: E1004 02:40:48.551068 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:40:48 crc kubenswrapper[4964]: E1004 02:40:48.551084 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 02:40:48 crc kubenswrapper[4964]: E1004 02:40:48.551095 4964 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:48 crc kubenswrapper[4964]: E1004 02:40:48.551093 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:40:48 crc kubenswrapper[4964]: E1004 02:40:48.551121 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 02:40:48 crc kubenswrapper[4964]: E1004 02:40:48.551132 4964 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:48 crc kubenswrapper[4964]: E1004 02:40:48.551143 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:56.551128617 +0000 UTC m=+36.448087255 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:48 crc kubenswrapper[4964]: E1004 02:40:48.551184 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:56.551167168 +0000 UTC m=+36.448125806 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.565149 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.577726 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.577763 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.577772 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.577787 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.577798 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:48Z","lastTransitionTime":"2025-10-04T02:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.602527 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.680659 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.680714 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.680724 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.680740 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.680754 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:48Z","lastTransitionTime":"2025-10-04T02:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.784171 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.784226 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.784242 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.784268 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.784284 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:48Z","lastTransitionTime":"2025-10-04T02:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.849157 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.857417 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:48 crc kubenswrapper[4964]: E1004 02:40:48.857577 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:40:48 crc kubenswrapper[4964]: E1004 02:40:48.857662 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.886511 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.886567 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.886581 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.886642 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.886658 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:48Z","lastTransitionTime":"2025-10-04T02:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.988911 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.988967 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.988976 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.988990 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:48 crc kubenswrapper[4964]: I1004 02:40:48.988999 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:48Z","lastTransitionTime":"2025-10-04T02:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.091855 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.091925 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.091946 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.091973 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.091992 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:49Z","lastTransitionTime":"2025-10-04T02:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.113989 4964 generic.go:334] "Generic (PLEG): container finished" podID="e36b0a0d-d6be-4917-a161-26245a74904a" containerID="2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95" exitCode=0 Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.114108 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" event={"ID":"e36b0a0d-d6be-4917-a161-26245a74904a","Type":"ContainerDied","Data":"2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95"} Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.137645 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:49Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.164922 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:49Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.190133 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:49Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.195592 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.195677 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.195693 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.195711 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.195724 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:49Z","lastTransitionTime":"2025-10-04T02:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.211896 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb6762
8a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:49Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.228257 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:49Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.245774 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:49Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.262554 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c
4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:49Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.276858 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:49Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.293077 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:49Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.298178 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.298230 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.298248 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.298272 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.298291 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:49Z","lastTransitionTime":"2025-10-04T02:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.314138 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:
40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:49Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.338975 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:49Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.361382 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:49Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.376810 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:49Z is after 2025-08-24T17:21:41Z" Oct 04 
02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.396338 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:49Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.402249 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.402308 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.402337 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.402363 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.402381 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:49Z","lastTransitionTime":"2025-10-04T02:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.505134 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.505199 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.505217 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.505242 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.505259 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:49Z","lastTransitionTime":"2025-10-04T02:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.609271 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.609329 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.609346 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.609373 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.609393 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:49Z","lastTransitionTime":"2025-10-04T02:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.715097 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.715154 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.715173 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.715199 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.715215 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:49Z","lastTransitionTime":"2025-10-04T02:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.817963 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.818023 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.818039 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.818063 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.818082 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:49Z","lastTransitionTime":"2025-10-04T02:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.844971 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:49 crc kubenswrapper[4964]: E1004 02:40:49.845241 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.921054 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.921135 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.921153 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.921179 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:49 crc kubenswrapper[4964]: I1004 02:40:49.921195 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:49Z","lastTransitionTime":"2025-10-04T02:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.025426 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.025531 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.025550 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.025577 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.025595 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:50Z","lastTransitionTime":"2025-10-04T02:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.123497 4964 generic.go:334] "Generic (PLEG): container finished" podID="e36b0a0d-d6be-4917-a161-26245a74904a" containerID="0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b" exitCode=0 Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.123610 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" event={"ID":"e36b0a0d-d6be-4917-a161-26245a74904a","Type":"ContainerDied","Data":"0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b"} Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.127995 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.128060 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.128082 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.128111 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.128138 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:50Z","lastTransitionTime":"2025-10-04T02:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.142988 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.156744 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.174050 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.192328 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.225451 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.232750 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.232809 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.232828 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.232853 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.232870 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:50Z","lastTransitionTime":"2025-10-04T02:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.246186 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.260377 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 
02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.273374 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.288396 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.302260 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.318193 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.336869 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.338935 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.338991 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.339005 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.339025 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.339039 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:50Z","lastTransitionTime":"2025-10-04T02:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.353307 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.372879 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.442924 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.443304 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.443318 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.443339 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.443354 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:50Z","lastTransitionTime":"2025-10-04T02:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.546253 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.546290 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.546301 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.546318 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.546327 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:50Z","lastTransitionTime":"2025-10-04T02:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.649442 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.649498 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.649510 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.649528 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.649541 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:50Z","lastTransitionTime":"2025-10-04T02:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.752482 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.752557 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.752576 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.752607 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.752662 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:50Z","lastTransitionTime":"2025-10-04T02:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.844796 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.844854 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:50 crc kubenswrapper[4964]: E1004 02:40:50.845083 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:40:50 crc kubenswrapper[4964]: E1004 02:40:50.845292 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.854996 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.855054 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.855072 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.855097 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.855115 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:50Z","lastTransitionTime":"2025-10-04T02:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.868577 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.888061 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.907923 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.925138 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 
02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.939786 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.959671 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.959734 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.959754 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.959780 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.959799 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:50Z","lastTransitionTime":"2025-10-04T02:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.962565 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z 
is after 2025-08-24T17:21:41Z" Oct 04 02:40:50 crc kubenswrapper[4964]: I1004 02:40:50.990117 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.010359 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.033781 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.056701 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.061563 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.061602 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.061628 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.061645 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.061655 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:51Z","lastTransitionTime":"2025-10-04T02:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.075849 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.089144 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.102047 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.116596 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.130597 4964 generic.go:334] "Generic (PLEG): container finished" podID="e36b0a0d-d6be-4917-a161-26245a74904a" containerID="be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb" exitCode=0 Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.130645 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" event={"ID":"e36b0a0d-d6be-4917-a161-26245a74904a","Type":"ContainerDied","Data":"be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb"} Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.147568 
4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.159751 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.167161 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.167198 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.167211 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.167231 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.167246 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:51Z","lastTransitionTime":"2025-10-04T02:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.175699 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.191015 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 
02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.203423 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.218292 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb41
3bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.265936 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.270672 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.270699 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.270707 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.270720 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.270730 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:51Z","lastTransitionTime":"2025-10-04T02:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.281807 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.294444 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.309685 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.326092 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.340191 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.355189 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.374243 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.374275 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.374284 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.374299 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.374310 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:51Z","lastTransitionTime":"2025-10-04T02:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.375939 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.477825 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.477888 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.477899 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.477954 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.477971 4964 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:51Z","lastTransitionTime":"2025-10-04T02:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.582458 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.582873 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.582909 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.582945 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.582967 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:51Z","lastTransitionTime":"2025-10-04T02:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.686060 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.686122 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.686142 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.686166 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.686184 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:51Z","lastTransitionTime":"2025-10-04T02:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.789339 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.789454 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.789473 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.789503 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.789523 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:51Z","lastTransitionTime":"2025-10-04T02:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.845104 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:51 crc kubenswrapper[4964]: E1004 02:40:51.845701 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.893255 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.893326 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.893349 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.893382 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.893404 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:51Z","lastTransitionTime":"2025-10-04T02:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.995905 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.995950 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.995958 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.995972 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:51 crc kubenswrapper[4964]: I1004 02:40:51.995982 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:51Z","lastTransitionTime":"2025-10-04T02:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.099190 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.099257 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.099274 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.099297 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.099315 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:52Z","lastTransitionTime":"2025-10-04T02:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.140770 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" event={"ID":"e36b0a0d-d6be-4917-a161-26245a74904a","Type":"ContainerStarted","Data":"7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d"} Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.144233 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovnkube-controller/0.log" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.149122 4964 generic.go:334] "Generic (PLEG): container finished" podID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerID="cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16" exitCode=1 Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.149183 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerDied","Data":"cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16"} Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.150365 4964 scope.go:117] "RemoveContainer" containerID="cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.161228 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c
4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.177938 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.200857 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.202083 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.202148 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.202166 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.202191 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.202208 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:52Z","lastTransitionTime":"2025-10-04T02:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.231814 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.253012 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.272883 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 
02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.289841 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.305095 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.305158 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.305177 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.305200 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.305218 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:52Z","lastTransitionTime":"2025-10-04T02:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.310458 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.329916 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.356071 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.376080 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.397218 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.416271 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.416356 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.416380 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.416411 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.416434 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:52Z","lastTransitionTime":"2025-10-04T02:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.425852 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.447550 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.464382 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.482626 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c5303470
9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.498642 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"message\\\":\\\"51.206672 6187 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.214953 6187 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1004 
02:40:51.214993 6187 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1004 02:40:51.206758 6187 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.215033 6187 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1004 02:40:51.215049 6187 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1004 02:40:51.215137 6187 handler.go:208] Removed *v1.Node event handler 2\\\\nI1004 02:40:51.206792 6187 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.206722 6187 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.208414 6187 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1004 02:40:51.206827 6187 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.217711 6187 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.515807 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.518907 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.518964 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.518983 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.519006 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.519025 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:52Z","lastTransitionTime":"2025-10-04T02:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.532845 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.547032 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.564826 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.577184 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.596840 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.615504 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.628701 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.628739 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.628758 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.628783 
4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.628803 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:52Z","lastTransitionTime":"2025-10-04T02:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.635383 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.653309 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.670652 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c
4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.684546 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:52Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.731873 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.731953 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.731978 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.732010 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.732033 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:52Z","lastTransitionTime":"2025-10-04T02:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.837796 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.837863 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.837886 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.837916 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.837935 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:52Z","lastTransitionTime":"2025-10-04T02:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.845237 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:52 crc kubenswrapper[4964]: E1004 02:40:52.845419 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.845839 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:52 crc kubenswrapper[4964]: E1004 02:40:52.846076 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.940823 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.940874 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.940890 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.940914 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:52 crc kubenswrapper[4964]: I1004 02:40:52.940933 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:52Z","lastTransitionTime":"2025-10-04T02:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.042829 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.042881 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.042892 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.042911 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.042924 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:53Z","lastTransitionTime":"2025-10-04T02:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.145259 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.145329 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.145339 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.145352 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.145362 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:53Z","lastTransitionTime":"2025-10-04T02:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.154437 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovnkube-controller/0.log" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.157563 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerStarted","Data":"22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a"} Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.157983 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.172138 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015b
d013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.183869 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.201540 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.217183 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 
02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.228749 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.245526 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb41
3bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.247527 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.247580 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.247600 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.247664 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.247683 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:53Z","lastTransitionTime":"2025-10-04T02:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.274811 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"message\\\":\\\"51.206672 6187 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.214953 6187 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1004 02:40:51.214993 6187 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1004 02:40:51.206758 6187 reflector.go:311] 
Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.215033 6187 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1004 02:40:51.215049 6187 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1004 02:40:51.215137 6187 handler.go:208] Removed *v1.Node event handler 2\\\\nI1004 02:40:51.206792 6187 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.206722 6187 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.208414 6187 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1004 02:40:51.206827 6187 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.217711 6187 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.292579 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.309264 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.320729 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.334766 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.350180 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.350226 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.350237 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.350257 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.350272 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:53Z","lastTransitionTime":"2025-10-04T02:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.350939 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.369829 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.387453 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.447898 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.451899 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.451937 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.451948 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.451967 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.451979 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:53Z","lastTransitionTime":"2025-10-04T02:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.460039 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.472896 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.486197 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04
T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.496235 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\
"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.510103 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f
0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"19
2.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.529834 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"message\\\":\\\"51.206672 6187 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.214953 6187 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1004 02:40:51.214993 6187 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1004 02:40:51.206758 6187 reflector.go:311] 
Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.215033 6187 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1004 02:40:51.215049 6187 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1004 02:40:51.215137 6187 handler.go:208] Removed *v1.Node event handler 2\\\\nI1004 02:40:51.206792 6187 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.206722 6187 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.208414 6187 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1004 02:40:51.206827 6187 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.217711 6187 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.544240 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873
cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.553731 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.553767 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.553778 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.553793 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.553804 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:53Z","lastTransitionTime":"2025-10-04T02:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.556817 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.569117 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.581754 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.595062 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.609280 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.621516 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.636449 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:53Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.656385 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.656708 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.656870 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.657043 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.657166 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:53Z","lastTransitionTime":"2025-10-04T02:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.759501 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.759880 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.760063 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.760200 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.760313 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:53Z","lastTransitionTime":"2025-10-04T02:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.844874 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:53 crc kubenswrapper[4964]: E1004 02:40:53.845315 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.863493 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.863560 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.863578 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.863603 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.863650 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:53Z","lastTransitionTime":"2025-10-04T02:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.966730 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.966787 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.966805 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.966828 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:53 crc kubenswrapper[4964]: I1004 02:40:53.966844 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:53Z","lastTransitionTime":"2025-10-04T02:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.069915 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.069976 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.069993 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.070021 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.070043 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:54Z","lastTransitionTime":"2025-10-04T02:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.164393 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovnkube-controller/1.log" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.165983 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovnkube-controller/0.log" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.170466 4964 generic.go:334] "Generic (PLEG): container finished" podID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerID="22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a" exitCode=1 Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.170525 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerDied","Data":"22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a"} Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.170572 4964 scope.go:117] "RemoveContainer" containerID="cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.171913 4964 scope.go:117] "RemoveContainer" containerID="22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a" Oct 04 02:40:54 crc kubenswrapper[4964]: E1004 02:40:54.172195 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.174774 4964 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.174834 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.174851 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.174875 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.174891 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:54Z","lastTransitionTime":"2025-10-04T02:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.193922 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:54Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.213915 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:54Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.235046 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:54Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.253787 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:54Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.271486 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:54Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.277324 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.277386 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.277404 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.277427 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.277446 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:54Z","lastTransitionTime":"2025-10-04T02:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.289760 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:54Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.312097 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:54Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.330470 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:54Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:54 crc kubenswrapper[4964]: 
I1004 02:40:54.345090 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:54Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.362934 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"run
ning\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:54Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.379825 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.379856 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.379867 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.379884 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.379896 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:54Z","lastTransitionTime":"2025-10-04T02:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.382718 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:54Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.398250 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:54Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.412421 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:54Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.441173 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"message\\\":\\\"51.206672 6187 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.214953 6187 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1004 02:40:51.214993 6187 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1004 02:40:51.206758 6187 reflector.go:311] 
Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.215033 6187 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1004 02:40:51.215049 6187 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1004 02:40:51.215137 6187 handler.go:208] Removed *v1.Node event handler 2\\\\nI1004 02:40:51.206792 6187 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.206722 6187 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.208414 6187 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1004 02:40:51.206827 6187 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.217711 6187 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"message\\\":\\\":53.371703 6397 services_controller.go:443] Built service openshift-console/console LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.194\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1004 02:40:53.371746 6397 services_controller.go:444] Built service 
openshift-console/console LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371795 6397 services_controller.go:445] Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371828 6397 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-d
ir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"nam
e\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:54Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:54 crc kubenswrapper[4964]: 
I1004 02:40:54.483262 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.483316 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.483333 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.483356 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.483374 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:54Z","lastTransitionTime":"2025-10-04T02:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.586077 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.586130 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.586146 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.586167 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.586182 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:54Z","lastTransitionTime":"2025-10-04T02:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.689430 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.689504 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.689527 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.689556 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.689577 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:54Z","lastTransitionTime":"2025-10-04T02:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.792754 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.792824 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.792842 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.792866 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.792884 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:54Z","lastTransitionTime":"2025-10-04T02:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.848763 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.848789 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:54 crc kubenswrapper[4964]: E1004 02:40:54.849044 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:40:54 crc kubenswrapper[4964]: E1004 02:40:54.849182 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.895848 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.895920 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.895941 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.895970 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.895992 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:54Z","lastTransitionTime":"2025-10-04T02:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.980115 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h"] Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.980896 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.983557 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.984105 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.998739 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.998810 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.998833 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.998864 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:54 crc kubenswrapper[4964]: I1004 02:40:54.998884 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:54Z","lastTransitionTime":"2025-10-04T02:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.004775 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.025477 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.025647 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pldrn\" (UniqueName: \"kubernetes.io/projected/d9bbac90-6a71-48f5-8524-799b00786492-kube-api-access-pldrn\") pod \"ovnkube-control-plane-749d76644c-9vc9h\" (UID: \"d9bbac90-6a71-48f5-8524-799b00786492\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.025839 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9bbac90-6a71-48f5-8524-799b00786492-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9vc9h\" (UID: \"d9bbac90-6a71-48f5-8524-799b00786492\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.025902 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9bbac90-6a71-48f5-8524-799b00786492-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9vc9h\" (UID: \"d9bbac90-6a71-48f5-8524-799b00786492\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.025954 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9bbac90-6a71-48f5-8524-799b00786492-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9vc9h\" (UID: \"d9bbac90-6a71-48f5-8524-799b00786492\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.044686 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.076581 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfd3d88e4613c6c26d75707e70d2617be29caee83769510e2d34590c27dc8e16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"message\\\":\\\"51.206672 6187 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.214953 6187 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1004 02:40:51.214993 6187 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1004 02:40:51.206758 6187 reflector.go:311] 
Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.215033 6187 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1004 02:40:51.215049 6187 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1004 02:40:51.215137 6187 handler.go:208] Removed *v1.Node event handler 2\\\\nI1004 02:40:51.206792 6187 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.206722 6187 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.208414 6187 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1004 02:40:51.206827 6187 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:40:51.217711 6187 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"message\\\":\\\":53.371703 6397 services_controller.go:443] Built service openshift-console/console LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.194\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1004 02:40:53.371746 6397 services_controller.go:444] Built service 
openshift-console/console LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371795 6397 services_controller.go:445] Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371828 6397 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-d
ir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"nam
e\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: 
I1004 02:40:55.099837 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea31
4d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"con
tainerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.101597 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.101671 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.101689 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.101710 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.101727 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:55Z","lastTransitionTime":"2025-10-04T02:40:55Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.120303 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.127706 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pldrn\" (UniqueName: \"kubernetes.io/projected/d9bbac90-6a71-48f5-8524-799b00786492-kube-api-access-pldrn\") pod \"ovnkube-control-plane-749d76644c-9vc9h\" (UID: \"d9bbac90-6a71-48f5-8524-799b00786492\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.127800 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9bbac90-6a71-48f5-8524-799b00786492-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9vc9h\" (UID: \"d9bbac90-6a71-48f5-8524-799b00786492\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.127846 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/d9bbac90-6a71-48f5-8524-799b00786492-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9vc9h\" (UID: \"d9bbac90-6a71-48f5-8524-799b00786492\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.127893 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9bbac90-6a71-48f5-8524-799b00786492-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9vc9h\" (UID: \"d9bbac90-6a71-48f5-8524-799b00786492\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.128947 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9bbac90-6a71-48f5-8524-799b00786492-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9vc9h\" (UID: \"d9bbac90-6a71-48f5-8524-799b00786492\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.128986 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9bbac90-6a71-48f5-8524-799b00786492-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9vc9h\" (UID: \"d9bbac90-6a71-48f5-8524-799b00786492\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.138647 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9bbac90-6a71-48f5-8524-799b00786492-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9vc9h\" (UID: \"d9bbac90-6a71-48f5-8524-799b00786492\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" Oct 04 02:40:55 crc 
kubenswrapper[4964]: I1004 02:40:55.140597 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.160946 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.163210 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pldrn\" (UniqueName: \"kubernetes.io/projected/d9bbac90-6a71-48f5-8524-799b00786492-kube-api-access-pldrn\") pod \"ovnkube-control-plane-749d76644c-9vc9h\" (UID: \"d9bbac90-6a71-48f5-8524-799b00786492\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.178083 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.179006 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovnkube-controller/1.log" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.185361 4964 scope.go:117] "RemoveContainer" containerID="22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a" Oct 04 02:40:55 crc kubenswrapper[4964]: E1004 02:40:55.185645 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.198706 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.204286 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.204333 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.204350 4964 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.204372 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.204389 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:55Z","lastTransitionTime":"2025-10-04T02:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.221332 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96
e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.241854 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.255226 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.271078 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.287232 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 
02:40:55.303262 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.308156 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.308222 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.308246 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.308277 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.308299 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:55Z","lastTransitionTime":"2025-10-04T02:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.310361 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.324176 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.358737 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.376503 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.397592 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.411372 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.411412 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.411430 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.411452 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.411470 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:55Z","lastTransitionTime":"2025-10-04T02:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.420272 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.435016 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.450117 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.461120 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.478869 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.493458 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02
:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.514797 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.514847 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.514863 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.514886 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.514904 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:55Z","lastTransitionTime":"2025-10-04T02:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.524503 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"message\\\":\\\":53.371703 6397 services_controller.go:443] Built service openshift-console/console LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.194\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1004 02:40:53.371746 6397 services_controller.go:444] Built service openshift-console/console LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371795 6397 services_controller.go:445] Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371828 6397 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051
342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.547972 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873
cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.569589 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.589947 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:55Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.617877 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.617917 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.617926 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.617941 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.617950 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:55Z","lastTransitionTime":"2025-10-04T02:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.720028 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.720093 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.720112 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.720137 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.720154 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:55Z","lastTransitionTime":"2025-10-04T02:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.822715 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.822755 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.822766 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.822783 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.822796 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:55Z","lastTransitionTime":"2025-10-04T02:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.844687 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:55 crc kubenswrapper[4964]: E1004 02:40:55.844880 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.925420 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.925480 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.925496 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.925522 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:55 crc kubenswrapper[4964]: I1004 02:40:55.925543 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:55Z","lastTransitionTime":"2025-10-04T02:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.036729 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.036787 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.036807 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.036830 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.036850 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:56Z","lastTransitionTime":"2025-10-04T02:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.140121 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.140202 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.140221 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.140244 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.140263 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:56Z","lastTransitionTime":"2025-10-04T02:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.190916 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" event={"ID":"d9bbac90-6a71-48f5-8524-799b00786492","Type":"ContainerStarted","Data":"afaf3789213dd1df6139727f1ff4e0f1f40185a4d42bf60b150ab24224758eb8"} Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.191003 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" event={"ID":"d9bbac90-6a71-48f5-8524-799b00786492","Type":"ContainerStarted","Data":"ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6"} Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.191028 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" event={"ID":"d9bbac90-6a71-48f5-8524-799b00786492","Type":"ContainerStarted","Data":"5e580bb017196c1eb17630208ecbb565345d2d0e39f28b8243a2b1d06fb7f0f0"} Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.212675 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.234218 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.243412 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.243468 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.243489 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.243513 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.243531 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:56Z","lastTransitionTime":"2025-10-04T02:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.257754 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.281595 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.301838 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.321667 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.346174 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.346253 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.346274 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.346300 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.346318 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:56Z","lastTransitionTime":"2025-10-04T02:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.348190 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.361916 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.379753 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f40185a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.392555 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.410379 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.430508 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"message\\\":\\\":53.371703 6397 services_controller.go:443] Built service openshift-console/console LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.194\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1004 02:40:53.371746 6397 services_controller.go:444] Built service openshift-console/console LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371795 6397 services_controller.go:445] Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371828 6397 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051
342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.446823 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873
cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.449302 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.449382 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.449407 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.449440 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.449464 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:56Z","lastTransitionTime":"2025-10-04T02:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.466402 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.487034 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.545507 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.545760 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.545819 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:41:12.545779902 +0000 UTC m=+52.442738580 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.545907 4964 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.545946 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.546001 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 02:41:12.545973247 +0000 UTC m=+52.442931925 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.546124 4964 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.546187 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 02:41:12.546173172 +0000 UTC m=+52.443131840 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.553097 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.553159 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.553181 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.553211 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 
04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.553235 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:56Z","lastTransitionTime":"2025-10-04T02:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.646471 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.646541 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.646723 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.646745 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.646758 4964 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.646811 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-04 02:41:12.64679471 +0000 UTC m=+52.543753438 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.647183 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.647202 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.647212 4964 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.647239 4964 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-04 02:41:12.647230391 +0000 UTC m=+52.544189029 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.655249 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.655286 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.655299 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.655316 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.655328 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:56Z","lastTransitionTime":"2025-10-04T02:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.758432 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.758468 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.758479 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.758497 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.758508 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:56Z","lastTransitionTime":"2025-10-04T02:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.844958 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.845120 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.845177 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.845322 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.861074 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.861135 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.861154 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.861181 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.861200 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:56Z","lastTransitionTime":"2025-10-04T02:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.876603 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-xrr6r"] Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.877534 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:40:56 crc kubenswrapper[4964]: E1004 02:40:56.877707 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.897660 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.915261 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.931602 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.949337 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nbzx\" (UniqueName: \"kubernetes.io/projected/7f1c9150-b444-41bb-9233-d76c4765a2d0-kube-api-access-5nbzx\") pod \"network-metrics-daemon-xrr6r\" (UID: \"7f1c9150-b444-41bb-9233-d76c4765a2d0\") " pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.949409 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs\") pod \"network-metrics-daemon-xrr6r\" (UID: \"7f1c9150-b444-41bb-9233-d76c4765a2d0\") " pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.949700 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.963902 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.963947 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.963958 4964 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.963975 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.963988 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:56Z","lastTransitionTime":"2025-10-04T02:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.970831 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96
e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:56 crc kubenswrapper[4964]: I1004 02:40:56.990158 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:56Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc 
kubenswrapper[4964]: I1004 02:40:57.010468 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:57Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.027417 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:57Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.045094 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f40185a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:57Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.051073 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nbzx\" (UniqueName: \"kubernetes.io/projected/7f1c9150-b444-41bb-9233-d76c4765a2d0-kube-api-access-5nbzx\") pod \"network-metrics-daemon-xrr6r\" (UID: \"7f1c9150-b444-41bb-9233-d76c4765a2d0\") " pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.051133 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs\") pod \"network-metrics-daemon-xrr6r\" (UID: \"7f1c9150-b444-41bb-9233-d76c4765a2d0\") " pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:40:57 crc kubenswrapper[4964]: E1004 02:40:57.051324 4964 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 02:40:57 crc kubenswrapper[4964]: E1004 02:40:57.051426 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs podName:7f1c9150-b444-41bb-9233-d76c4765a2d0 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:57.551397537 +0000 UTC m=+37.448356205 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs") pod "network-metrics-daemon-xrr6r" (UID: "7f1c9150-b444-41bb-9233-d76c4765a2d0") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.063127 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:57Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc 
kubenswrapper[4964]: I1004 02:40:57.068463 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.068532 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.068555 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.068584 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.068606 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:57Z","lastTransitionTime":"2025-10-04T02:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.080760 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nbzx\" (UniqueName: \"kubernetes.io/projected/7f1c9150-b444-41bb-9233-d76c4765a2d0-kube-api-access-5nbzx\") pod \"network-metrics-daemon-xrr6r\" (UID: \"7f1c9150-b444-41bb-9233-d76c4765a2d0\") " pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.088581 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-over
rides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:57Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.106769 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:57Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.126364 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:57Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.162586 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"message\\\":\\\":53.371703 6397 services_controller.go:443] Built service openshift-console/console LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.194\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1004 02:40:53.371746 6397 services_controller.go:444] Built service openshift-console/console LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371795 6397 services_controller.go:445] Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371828 6397 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051
342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:57Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.171522 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.171576 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.171593 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.171660 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.171685 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:57Z","lastTransitionTime":"2025-10-04T02:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.184788 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0
beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:57Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.204694 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:57Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.274497 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.274817 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.274947 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.275074 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.275185 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:57Z","lastTransitionTime":"2025-10-04T02:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.292267 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.292309 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.292325 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.292342 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.292356 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:57Z","lastTransitionTime":"2025-10-04T02:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:57 crc kubenswrapper[4964]: E1004 02:40:57.311958 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:57Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.316836 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.316897 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.316919 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.316946 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.316967 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:57Z","lastTransitionTime":"2025-10-04T02:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:57 crc kubenswrapper[4964]: E1004 02:40:57.338144 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:57Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.343506 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.343559 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.343576 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.343599 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.343649 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:57Z","lastTransitionTime":"2025-10-04T02:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:57 crc kubenswrapper[4964]: E1004 02:40:57.362018 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:57Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.367489 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.367543 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.367567 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.367602 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.367670 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:57Z","lastTransitionTime":"2025-10-04T02:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:57 crc kubenswrapper[4964]: E1004 02:40:57.387047 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:57Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.392377 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.392431 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.392453 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.392484 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.392506 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:57Z","lastTransitionTime":"2025-10-04T02:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:57 crc kubenswrapper[4964]: E1004 02:40:57.405919 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:40:57Z is after 2025-08-24T17:21:41Z" Oct 04 02:40:57 crc kubenswrapper[4964]: E1004 02:40:57.406024 4964 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.407575 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.407596 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.407603 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.407633 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.407643 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:57Z","lastTransitionTime":"2025-10-04T02:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.510441 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.510525 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.510551 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.510581 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.510604 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:57Z","lastTransitionTime":"2025-10-04T02:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.556041 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs\") pod \"network-metrics-daemon-xrr6r\" (UID: \"7f1c9150-b444-41bb-9233-d76c4765a2d0\") " pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:40:57 crc kubenswrapper[4964]: E1004 02:40:57.556207 4964 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 02:40:57 crc kubenswrapper[4964]: E1004 02:40:57.556288 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs podName:7f1c9150-b444-41bb-9233-d76c4765a2d0 nodeName:}" failed. No retries permitted until 2025-10-04 02:40:58.556265722 +0000 UTC m=+38.453224380 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs") pod "network-metrics-daemon-xrr6r" (UID: "7f1c9150-b444-41bb-9233-d76c4765a2d0") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.613216 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.613279 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.613302 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.613330 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.613350 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:57Z","lastTransitionTime":"2025-10-04T02:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.716124 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.716184 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.716202 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.716228 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.716250 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:57Z","lastTransitionTime":"2025-10-04T02:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.818842 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.819122 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.819135 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.819155 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.819166 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:57Z","lastTransitionTime":"2025-10-04T02:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.845116 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:57 crc kubenswrapper[4964]: E1004 02:40:57.845322 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.921806 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.921852 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.921864 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.921883 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:57 crc kubenswrapper[4964]: I1004 02:40:57.921897 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:57Z","lastTransitionTime":"2025-10-04T02:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.024931 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.024985 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.024999 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.025018 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.025032 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:58Z","lastTransitionTime":"2025-10-04T02:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.127686 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.127722 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.127733 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.127749 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.127760 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:58Z","lastTransitionTime":"2025-10-04T02:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.235611 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.235714 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.235732 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.235758 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.235776 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:58Z","lastTransitionTime":"2025-10-04T02:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.337888 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.337947 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.337965 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.337989 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.338008 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:58Z","lastTransitionTime":"2025-10-04T02:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.440858 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.440899 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.440910 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.440926 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.440938 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:58Z","lastTransitionTime":"2025-10-04T02:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.544411 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.544472 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.544484 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.544498 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.544525 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:58Z","lastTransitionTime":"2025-10-04T02:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.568493 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs\") pod \"network-metrics-daemon-xrr6r\" (UID: \"7f1c9150-b444-41bb-9233-d76c4765a2d0\") " pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:40:58 crc kubenswrapper[4964]: E1004 02:40:58.568696 4964 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 02:40:58 crc kubenswrapper[4964]: E1004 02:40:58.568778 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs podName:7f1c9150-b444-41bb-9233-d76c4765a2d0 nodeName:}" failed. No retries permitted until 2025-10-04 02:41:00.568753402 +0000 UTC m=+40.465712080 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs") pod "network-metrics-daemon-xrr6r" (UID: "7f1c9150-b444-41bb-9233-d76c4765a2d0") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.649075 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.649144 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.649161 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.649179 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.649192 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:58Z","lastTransitionTime":"2025-10-04T02:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.751983 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.752053 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.752072 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.752095 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.752113 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:58Z","lastTransitionTime":"2025-10-04T02:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.845345 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.845414 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:40:58 crc kubenswrapper[4964]: E1004 02:40:58.845541 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.845555 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:40:58 crc kubenswrapper[4964]: E1004 02:40:58.845779 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:40:58 crc kubenswrapper[4964]: E1004 02:40:58.845908 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.855039 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.855091 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.855107 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.855128 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.855146 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:58Z","lastTransitionTime":"2025-10-04T02:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.957449 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.957501 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.957528 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.957550 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:58 crc kubenswrapper[4964]: I1004 02:40:58.957569 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:58Z","lastTransitionTime":"2025-10-04T02:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.060319 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.060449 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.060461 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.060478 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.060490 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:59Z","lastTransitionTime":"2025-10-04T02:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.162471 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.162544 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.162567 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.162603 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.162664 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:59Z","lastTransitionTime":"2025-10-04T02:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.265497 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.265558 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.265573 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.265593 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.265605 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:59Z","lastTransitionTime":"2025-10-04T02:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.368344 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.368410 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.368427 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.368453 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.368481 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:59Z","lastTransitionTime":"2025-10-04T02:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.471104 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.471144 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.471152 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.471166 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.471176 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:59Z","lastTransitionTime":"2025-10-04T02:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.573363 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.573428 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.573449 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.573478 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.573500 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:59Z","lastTransitionTime":"2025-10-04T02:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.676312 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.676383 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.676402 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.676427 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.676448 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:59Z","lastTransitionTime":"2025-10-04T02:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.778741 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.778809 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.778834 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.778859 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.778876 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:59Z","lastTransitionTime":"2025-10-04T02:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.844249 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:40:59 crc kubenswrapper[4964]: E1004 02:40:59.844430 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.881685 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.881716 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.881729 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.881748 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.881762 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:59Z","lastTransitionTime":"2025-10-04T02:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.984157 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.984219 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.984246 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.984278 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:40:59 crc kubenswrapper[4964]: I1004 02:40:59.984299 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:40:59Z","lastTransitionTime":"2025-10-04T02:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.087337 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.087390 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.087408 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.087425 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.087437 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:00Z","lastTransitionTime":"2025-10-04T02:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.190320 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.190395 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.190413 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.190438 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.190455 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:00Z","lastTransitionTime":"2025-10-04T02:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.293158 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.293198 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.293209 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.293225 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.293236 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:00Z","lastTransitionTime":"2025-10-04T02:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.395811 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.395897 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.395923 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.395952 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.395976 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:00Z","lastTransitionTime":"2025-10-04T02:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.498298 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.498348 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.498364 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.498386 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.498403 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:00Z","lastTransitionTime":"2025-10-04T02:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.586704 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs\") pod \"network-metrics-daemon-xrr6r\" (UID: \"7f1c9150-b444-41bb-9233-d76c4765a2d0\") " pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:00 crc kubenswrapper[4964]: E1004 02:41:00.586923 4964 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 02:41:00 crc kubenswrapper[4964]: E1004 02:41:00.587043 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs podName:7f1c9150-b444-41bb-9233-d76c4765a2d0 nodeName:}" failed. No retries permitted until 2025-10-04 02:41:04.587014394 +0000 UTC m=+44.483973072 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs") pod "network-metrics-daemon-xrr6r" (UID: "7f1c9150-b444-41bb-9233-d76c4765a2d0") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.600438 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.600480 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.600495 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.600515 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.600531 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:00Z","lastTransitionTime":"2025-10-04T02:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.703853 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.703914 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.703932 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.703962 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.703982 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:00Z","lastTransitionTime":"2025-10-04T02:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.805956 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.806021 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.806038 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.806064 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.806081 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:00Z","lastTransitionTime":"2025-10-04T02:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.844298 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.844362 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.844445 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:00 crc kubenswrapper[4964]: E1004 02:41:00.844452 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:00 crc kubenswrapper[4964]: E1004 02:41:00.844575 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:00 crc kubenswrapper[4964]: E1004 02:41:00.844822 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.860880 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:00Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.878137 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:00Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.903076 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:00Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.909030 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.909100 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.909120 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.909145 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.909165 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:00Z","lastTransitionTime":"2025-10-04T02:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.916992 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:00Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:00 crc 
kubenswrapper[4964]: I1004 02:41:00.930721 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:00Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.954191 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:00Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.973258 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:00Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.987476 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c
4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:00Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:00 crc kubenswrapper[4964]: I1004 02:41:00.998051 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:00Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.012354 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.012399 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.012415 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.012435 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.012449 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:01Z","lastTransitionTime":"2025-10-04T02:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.012315 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f40185a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:01Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.025741 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:01Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.042958 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:01Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.074378 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"message\\\":\\\":53.371703 6397 services_controller.go:443] Built service openshift-console/console LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.194\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1004 02:40:53.371746 6397 services_controller.go:444] Built service openshift-console/console LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371795 6397 services_controller.go:445] Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371828 6397 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051
342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:01Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.098191 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873
cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:01Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.115226 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.115260 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.115270 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.115286 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.115297 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:01Z","lastTransitionTime":"2025-10-04T02:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.119519 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:01Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.134849 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:01Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.217457 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.217494 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.217509 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.217527 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.217540 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:01Z","lastTransitionTime":"2025-10-04T02:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.322523 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.322594 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.322659 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.322688 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.322707 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:01Z","lastTransitionTime":"2025-10-04T02:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.426743 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.426820 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.426837 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.426870 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.426889 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:01Z","lastTransitionTime":"2025-10-04T02:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.530739 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.530807 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.530827 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.530879 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.530897 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:01Z","lastTransitionTime":"2025-10-04T02:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.634328 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.634392 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.634411 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.634440 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.634458 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:01Z","lastTransitionTime":"2025-10-04T02:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.739113 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.739190 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.739208 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.739234 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.739254 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:01Z","lastTransitionTime":"2025-10-04T02:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.842475 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.842540 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.842558 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.842592 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.842654 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:01Z","lastTransitionTime":"2025-10-04T02:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.844930 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:01 crc kubenswrapper[4964]: E1004 02:41:01.845174 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.946251 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.946323 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.946343 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.946371 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:01 crc kubenswrapper[4964]: I1004 02:41:01.946390 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:01Z","lastTransitionTime":"2025-10-04T02:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.049499 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.049552 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.049570 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.049592 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.049608 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:02Z","lastTransitionTime":"2025-10-04T02:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.152176 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.152267 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.152295 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.152328 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.152356 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:02Z","lastTransitionTime":"2025-10-04T02:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.256096 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.256169 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.256190 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.256222 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.256246 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:02Z","lastTransitionTime":"2025-10-04T02:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.360039 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.360124 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.360147 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.360178 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.360207 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:02Z","lastTransitionTime":"2025-10-04T02:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.463124 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.463180 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.463197 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.463221 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.463238 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:02Z","lastTransitionTime":"2025-10-04T02:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.566767 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.566817 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.566833 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.566857 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.566873 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:02Z","lastTransitionTime":"2025-10-04T02:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.670891 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.670956 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.670972 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.670996 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.671014 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:02Z","lastTransitionTime":"2025-10-04T02:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.773794 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.773855 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.773873 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.773897 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.773914 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:02Z","lastTransitionTime":"2025-10-04T02:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.845051 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.845091 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:02 crc kubenswrapper[4964]: E1004 02:41:02.845266 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.845334 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:02 crc kubenswrapper[4964]: E1004 02:41:02.845532 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:02 crc kubenswrapper[4964]: E1004 02:41:02.845887 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.882103 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.882183 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.882210 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.882241 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.882263 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:02Z","lastTransitionTime":"2025-10-04T02:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.985676 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.985748 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.985766 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.985794 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:02 crc kubenswrapper[4964]: I1004 02:41:02.985815 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:02Z","lastTransitionTime":"2025-10-04T02:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.089022 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.089095 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.089118 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.089147 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.089167 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:03Z","lastTransitionTime":"2025-10-04T02:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.191756 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.191826 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.191845 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.191867 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.191885 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:03Z","lastTransitionTime":"2025-10-04T02:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.293676 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.293730 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.293748 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.293773 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.293791 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:03Z","lastTransitionTime":"2025-10-04T02:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.396424 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.396455 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.396466 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.396481 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.396492 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:03Z","lastTransitionTime":"2025-10-04T02:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.499410 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.499475 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.499494 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.499519 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.499537 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:03Z","lastTransitionTime":"2025-10-04T02:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.602798 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.602840 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.602850 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.602866 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.602877 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:03Z","lastTransitionTime":"2025-10-04T02:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.705704 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.705814 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.705841 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.705872 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.705892 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:03Z","lastTransitionTime":"2025-10-04T02:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.809414 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.809827 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.809989 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.810189 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.810368 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:03Z","lastTransitionTime":"2025-10-04T02:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.845087 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:03 crc kubenswrapper[4964]: E1004 02:41:03.845486 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.913302 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.913345 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.913354 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.913374 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:03 crc kubenswrapper[4964]: I1004 02:41:03.913386 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:03Z","lastTransitionTime":"2025-10-04T02:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.016033 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.016444 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.016599 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.016807 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.016952 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:04Z","lastTransitionTime":"2025-10-04T02:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.119238 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.119573 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.119754 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.119882 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.120027 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:04Z","lastTransitionTime":"2025-10-04T02:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.222348 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.222422 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.222445 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.222474 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.222498 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:04Z","lastTransitionTime":"2025-10-04T02:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.325023 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.325100 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.325122 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.325151 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.325173 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:04Z","lastTransitionTime":"2025-10-04T02:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.428121 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.428222 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.428247 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.428276 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.428299 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:04Z","lastTransitionTime":"2025-10-04T02:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.531200 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.531583 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.531794 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.531939 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.532071 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:04Z","lastTransitionTime":"2025-10-04T02:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.632504 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs\") pod \"network-metrics-daemon-xrr6r\" (UID: \"7f1c9150-b444-41bb-9233-d76c4765a2d0\") " pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:04 crc kubenswrapper[4964]: E1004 02:41:04.632779 4964 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 02:41:04 crc kubenswrapper[4964]: E1004 02:41:04.632890 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs podName:7f1c9150-b444-41bb-9233-d76c4765a2d0 nodeName:}" failed. No retries permitted until 2025-10-04 02:41:12.63285781 +0000 UTC m=+52.529816488 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs") pod "network-metrics-daemon-xrr6r" (UID: "7f1c9150-b444-41bb-9233-d76c4765a2d0") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.634599 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.634746 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.634783 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.634813 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.634833 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:04Z","lastTransitionTime":"2025-10-04T02:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.737946 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.738000 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.738016 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.738078 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.738126 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:04Z","lastTransitionTime":"2025-10-04T02:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.841252 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.841314 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.841330 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.841355 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.841376 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:04Z","lastTransitionTime":"2025-10-04T02:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.845040 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:04 crc kubenswrapper[4964]: E1004 02:41:04.845244 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.845333 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:04 crc kubenswrapper[4964]: E1004 02:41:04.845546 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.845740 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:04 crc kubenswrapper[4964]: E1004 02:41:04.846039 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.948763 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.948856 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.948886 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.949691 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:04 crc kubenswrapper[4964]: I1004 02:41:04.949757 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:04Z","lastTransitionTime":"2025-10-04T02:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.053087 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.053155 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.053174 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.053200 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.053219 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:05Z","lastTransitionTime":"2025-10-04T02:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.156552 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.156670 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.156692 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.156717 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.156734 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:05Z","lastTransitionTime":"2025-10-04T02:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.260023 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.260098 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.260114 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.260142 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.260161 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:05Z","lastTransitionTime":"2025-10-04T02:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.363053 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.363465 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.363717 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.364006 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.364235 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:05Z","lastTransitionTime":"2025-10-04T02:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.467310 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.467982 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.468078 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.468175 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.468255 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:05Z","lastTransitionTime":"2025-10-04T02:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.571365 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.571442 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.571466 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.571498 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.571518 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:05Z","lastTransitionTime":"2025-10-04T02:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.674367 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.674424 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.674445 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.674469 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.674488 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:05Z","lastTransitionTime":"2025-10-04T02:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.777518 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.777570 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.777588 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.777652 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.777671 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:05Z","lastTransitionTime":"2025-10-04T02:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.844886 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:05 crc kubenswrapper[4964]: E1004 02:41:05.845132 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.880691 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.880770 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.880790 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.880821 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.880842 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:05Z","lastTransitionTime":"2025-10-04T02:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.984101 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.984158 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.984174 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.984199 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:05 crc kubenswrapper[4964]: I1004 02:41:05.984242 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:05Z","lastTransitionTime":"2025-10-04T02:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.087558 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.087659 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.087684 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.087716 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.087735 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:06Z","lastTransitionTime":"2025-10-04T02:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.190754 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.190804 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.190821 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.190878 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.190915 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:06Z","lastTransitionTime":"2025-10-04T02:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.293861 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.293929 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.293947 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.293973 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.293990 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:06Z","lastTransitionTime":"2025-10-04T02:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.396853 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.396910 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.396926 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.396948 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.396964 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:06Z","lastTransitionTime":"2025-10-04T02:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.499922 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.499949 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.499958 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.499989 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.499998 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:06Z","lastTransitionTime":"2025-10-04T02:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.603258 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.603310 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.603329 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.603352 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.603369 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:06Z","lastTransitionTime":"2025-10-04T02:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.706804 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.706858 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.706876 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.706901 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.706918 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:06Z","lastTransitionTime":"2025-10-04T02:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.809716 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.809846 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.809870 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.809894 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.809910 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:06Z","lastTransitionTime":"2025-10-04T02:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.844569 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.844707 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:06 crc kubenswrapper[4964]: E1004 02:41:06.844778 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:06 crc kubenswrapper[4964]: E1004 02:41:06.844886 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.844913 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:06 crc kubenswrapper[4964]: E1004 02:41:06.845689 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.846115 4964 scope.go:117] "RemoveContainer" containerID="22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.913161 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.913214 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.913230 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.913254 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:06 crc kubenswrapper[4964]: I1004 02:41:06.913273 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:06Z","lastTransitionTime":"2025-10-04T02:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.016826 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.017201 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.017225 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.017250 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.017268 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:07Z","lastTransitionTime":"2025-10-04T02:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.123400 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.123469 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.123497 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.123524 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.123542 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:07Z","lastTransitionTime":"2025-10-04T02:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.227943 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.228005 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.228025 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.228049 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.228067 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:07Z","lastTransitionTime":"2025-10-04T02:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.238143 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovnkube-controller/1.log" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.243102 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerStarted","Data":"2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d"} Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.244056 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.260728 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015b
d013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.277578 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.296234 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f4018
5a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.317667 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873
cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.332471 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.332527 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.332539 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.332562 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.332574 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:07Z","lastTransitionTime":"2025-10-04T02:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.346169 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.368207 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.388747 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.419137 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"message\\\":\\\":53.371703 6397 services_controller.go:443] Built service openshift-console/console LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.194\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1004 02:40:53.371746 6397 services_controller.go:444] Built service openshift-console/console LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371795 6397 services_controller.go:445] Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371828 6397 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, 
Template:(*services.Template\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.435688 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.435730 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.435741 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.435758 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.435769 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:07Z","lastTransitionTime":"2025-10-04T02:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.442587 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.461438 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.480314 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.496585 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.501918 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.501964 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.501975 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.502004 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.502016 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:07Z","lastTransitionTime":"2025-10-04T02:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.511267 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: E1004 02:41:07.516322 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.519535 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.519574 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.519584 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.519650 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.519664 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:07Z","lastTransitionTime":"2025-10-04T02:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.524520 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: E1004 02:41:07.532906 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.536732 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.536914 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.537027 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.537132 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.537260 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:07Z","lastTransitionTime":"2025-10-04T02:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.537703 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: E1004 02:41:07.552263 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.557458 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742f
d0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mou
ntPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.559209 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.559442 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.559739 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.559960 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.560160 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:07Z","lastTransitionTime":"2025-10-04T02:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:07 crc kubenswrapper[4964]: E1004 02:41:07.575040 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.580231 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.580286 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.580302 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.580324 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.580339 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:07Z","lastTransitionTime":"2025-10-04T02:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:07 crc kubenswrapper[4964]: E1004 02:41:07.598076 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:07Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:07 crc kubenswrapper[4964]: E1004 02:41:07.598231 4964 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.600658 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.600711 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.600731 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.600756 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.600774 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:07Z","lastTransitionTime":"2025-10-04T02:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.704366 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.704424 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.704440 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.704465 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.704483 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:07Z","lastTransitionTime":"2025-10-04T02:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.808106 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.808162 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.808181 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.808209 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.808230 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:07Z","lastTransitionTime":"2025-10-04T02:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.855738 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:07 crc kubenswrapper[4964]: E1004 02:41:07.855940 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.911282 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.911369 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.911386 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.911413 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:07 crc kubenswrapper[4964]: I1004 02:41:07.911431 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:07Z","lastTransitionTime":"2025-10-04T02:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.015522 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.015584 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.015606 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.015670 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.015693 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:08Z","lastTransitionTime":"2025-10-04T02:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.120007 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.120079 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.120095 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.120117 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.120136 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:08Z","lastTransitionTime":"2025-10-04T02:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.223128 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.223223 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.223240 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.223268 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.223295 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:08Z","lastTransitionTime":"2025-10-04T02:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.249850 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovnkube-controller/2.log" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.250846 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovnkube-controller/1.log" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.255731 4964 generic.go:334] "Generic (PLEG): container finished" podID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerID="2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d" exitCode=1 Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.255784 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerDied","Data":"2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d"} Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.255837 4964 scope.go:117] "RemoveContainer" containerID="22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.256973 4964 scope.go:117] "RemoveContainer" containerID="2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d" Oct 04 02:41:08 crc kubenswrapper[4964]: E1004 02:41:08.257436 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.283547 4964 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.301494 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.322454 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f40185a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.326675 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.326731 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.326748 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.326774 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.326791 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:08Z","lastTransitionTime":"2025-10-04T02:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.340318 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.361254 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.395346 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22385a87da838790b659867b2d15e10610ddf1c7be3dc8c891a6907257d2bb2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"message\\\":\\\":53.371703 6397 services_controller.go:443] Built service openshift-console/console LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.194\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1004 02:40:53.371746 6397 services_controller.go:444] Built service openshift-console/console LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371795 6397 services_controller.go:445] Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI1004 02:40:53.371828 6397 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:08Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:41:07.989751 6606 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1004 02:41:07.989781 6606 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1004 02:41:07.989817 6606 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1004 02:41:07.989821 6606 handler.go:208] Removed *v1.Namespace 
event handler 1\\\\nI1004 02:41:07.989834 6606 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1004 02:41:07.989857 6606 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1004 02:41:07.989868 6606 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1004 02:41:07.989893 6606 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1004 02:41:07.989902 6606 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1004 02:41:07.989917 6606 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1004 02:41:07.989922 6606 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1004 02:41:07.989940 6606 factory.go:656] Stopping watch factory\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.Node event handler 7\\\\nI1004 02:41:07.989972 6606 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\"
,\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":
\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 
02:41:08.419120 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c
4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 
02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.429493 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.429536 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.429546 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.429564 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.429577 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:08Z","lastTransitionTime":"2025-10-04T02:41:08Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.441188 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.460806 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.482703 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.504534 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.530519 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.532546 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.532601 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.532647 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.532674 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.532692 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:08Z","lastTransitionTime":"2025-10-04T02:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.550014 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc 
kubenswrapper[4964]: I1004 02:41:08.568221 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.586251 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.607953 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:08Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.635729 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.635789 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.635805 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.635829 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.635846 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:08Z","lastTransitionTime":"2025-10-04T02:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.739235 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.739295 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.739316 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.739346 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.739370 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:08Z","lastTransitionTime":"2025-10-04T02:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.842606 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.842692 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.842707 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.842728 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.842745 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:08Z","lastTransitionTime":"2025-10-04T02:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.845295 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.845323 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:08 crc kubenswrapper[4964]: E1004 02:41:08.845503 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.845533 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:08 crc kubenswrapper[4964]: E1004 02:41:08.845669 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:08 crc kubenswrapper[4964]: E1004 02:41:08.845799 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.945732 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.945781 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.945792 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.945811 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:08 crc kubenswrapper[4964]: I1004 02:41:08.945823 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:08Z","lastTransitionTime":"2025-10-04T02:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.048843 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.048975 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.048994 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.049017 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.049033 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:09Z","lastTransitionTime":"2025-10-04T02:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.152892 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.152958 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.152976 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.153001 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.153019 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:09Z","lastTransitionTime":"2025-10-04T02:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.256335 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.256381 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.256401 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.256422 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.256439 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:09Z","lastTransitionTime":"2025-10-04T02:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.261604 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovnkube-controller/2.log" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.267527 4964 scope.go:117] "RemoveContainer" containerID="2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d" Oct 04 02:41:09 crc kubenswrapper[4964]: E1004 02:41:09.267922 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.286604 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c
4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.304019 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.322689 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f40185a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.342243 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.359230 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.359310 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 
02:41:09.359333 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.359364 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.359387 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:09Z","lastTransitionTime":"2025-10-04T02:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.361026 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132cae
dbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.382701 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.414291 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:08Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:41:07.989751 6606 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1004 02:41:07.989781 6606 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1004 02:41:07.989817 6606 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1004 02:41:07.989821 6606 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1004 02:41:07.989834 6606 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1004 02:41:07.989857 6606 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1004 02:41:07.989868 6606 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1004 02:41:07.989893 6606 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1004 02:41:07.989902 6606 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1004 02:41:07.989917 6606 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1004 02:41:07.989922 6606 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1004 02:41:07.989940 6606 factory.go:656] Stopping watch factory\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.Node event handler 7\\\\nI1004 02:41:07.989972 6606 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:41:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051
342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.438524 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873
cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.460048 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.462284 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.462345 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.462362 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:09 crc 
kubenswrapper[4964]: I1004 02:41:09.462390 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.462407 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:09Z","lastTransitionTime":"2025-10-04T02:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.481477 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.503166 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.523781 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.550257 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.565967 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.566031 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.566048 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.566076 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.566094 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:09Z","lastTransitionTime":"2025-10-04T02:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.569660 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc 
kubenswrapper[4964]: I1004 02:41:09.590837 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.615547 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:09Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.669122 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.669180 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.669197 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.669220 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.669238 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:09Z","lastTransitionTime":"2025-10-04T02:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.771660 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.771712 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.771728 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.771755 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.771772 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:09Z","lastTransitionTime":"2025-10-04T02:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.844820 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:09 crc kubenswrapper[4964]: E1004 02:41:09.845020 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.874691 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.874760 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.874787 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.874814 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.874835 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:09Z","lastTransitionTime":"2025-10-04T02:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.978151 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.978222 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.978239 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.978265 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:09 crc kubenswrapper[4964]: I1004 02:41:09.978282 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:09Z","lastTransitionTime":"2025-10-04T02:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.081892 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.081967 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.081990 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.082019 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.082037 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:10Z","lastTransitionTime":"2025-10-04T02:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.185751 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.185800 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.185816 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.185839 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.185860 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:10Z","lastTransitionTime":"2025-10-04T02:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.289005 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.289072 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.289089 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.289115 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.289145 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:10Z","lastTransitionTime":"2025-10-04T02:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.397761 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.397824 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.397841 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.397865 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.397882 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:10Z","lastTransitionTime":"2025-10-04T02:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.500832 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.500897 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.500913 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.500937 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.500954 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:10Z","lastTransitionTime":"2025-10-04T02:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.604208 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.604270 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.604288 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.604312 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.604330 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:10Z","lastTransitionTime":"2025-10-04T02:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.707195 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.707251 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.707271 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.707301 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.707324 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:10Z","lastTransitionTime":"2025-10-04T02:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.810479 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.810554 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.810577 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.810648 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.810675 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:10Z","lastTransitionTime":"2025-10-04T02:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.845417 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.845445 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:10 crc kubenswrapper[4964]: E1004 02:41:10.845735 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.845775 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:10 crc kubenswrapper[4964]: E1004 02:41:10.845993 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:10 crc kubenswrapper[4964]: E1004 02:41:10.846235 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.869392 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:08Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:41:07.989751 6606 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1004 02:41:07.989781 6606 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1004 02:41:07.989817 6606 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1004 02:41:07.989821 6606 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1004 02:41:07.989834 6606 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1004 02:41:07.989857 6606 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1004 02:41:07.989868 6606 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1004 02:41:07.989893 6606 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1004 02:41:07.989902 6606 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1004 02:41:07.989917 6606 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1004 02:41:07.989922 6606 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1004 02:41:07.989940 6606 factory.go:656] Stopping watch factory\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.Node event handler 7\\\\nI1004 02:41:07.989972 6606 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:41:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051
342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:10Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.889994 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873
cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:10Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.909286 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:10Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.914064 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.914187 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.914214 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.914243 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.914265 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:10Z","lastTransitionTime":"2025-10-04T02:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.923823 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:10Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.942048 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:10Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.967198 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:10Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:10 crc kubenswrapper[4964]: I1004 02:41:10.987178 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:10Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.004069 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:11Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:11 crc 
kubenswrapper[4964]: I1004 02:41:11.017234 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.017328 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.017351 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.017394 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.017495 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:11Z","lastTransitionTime":"2025-10-04T02:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.021882 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb6762
8a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:11Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.044170 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:11Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.059662 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:11Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.077073 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:41:11Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.093918 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:11Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.110944 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c
4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:11Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.120861 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.120931 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.120958 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:11 crc 
kubenswrapper[4964]: I1004 02:41:11.120989 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.121012 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:11Z","lastTransitionTime":"2025-10-04T02:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.129063 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:11Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.146608 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f4018
5a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:11Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.227842 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.227912 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.227932 4964 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.227961 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.227985 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:11Z","lastTransitionTime":"2025-10-04T02:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.331661 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.331751 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.331777 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.331816 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.331842 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:11Z","lastTransitionTime":"2025-10-04T02:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.451963 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.452035 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.452052 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.452079 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.452096 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:11Z","lastTransitionTime":"2025-10-04T02:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.555582 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.555675 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.555692 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.555717 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.555735 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:11Z","lastTransitionTime":"2025-10-04T02:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.660400 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.660464 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.660484 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.660510 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.660527 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:11Z","lastTransitionTime":"2025-10-04T02:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.763964 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.764026 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.764043 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.764069 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.764086 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:11Z","lastTransitionTime":"2025-10-04T02:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.844884 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:11 crc kubenswrapper[4964]: E1004 02:41:11.845053 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.867362 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.867403 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.867414 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.867432 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.867444 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:11Z","lastTransitionTime":"2025-10-04T02:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.971265 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.971312 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.971322 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.971341 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:11 crc kubenswrapper[4964]: I1004 02:41:11.971351 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:11Z","lastTransitionTime":"2025-10-04T02:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.074462 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.074507 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.074518 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.074535 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.074547 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:12Z","lastTransitionTime":"2025-10-04T02:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.177812 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.177879 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.177897 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.177920 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.177937 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:12Z","lastTransitionTime":"2025-10-04T02:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.280455 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.280515 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.280533 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.280556 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.280574 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:12Z","lastTransitionTime":"2025-10-04T02:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.384204 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.384267 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.384284 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.384313 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.384333 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:12Z","lastTransitionTime":"2025-10-04T02:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.487473 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.487536 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.487555 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.487583 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.487602 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:12Z","lastTransitionTime":"2025-10-04T02:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.591018 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.591096 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.591115 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.591141 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.591158 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:12Z","lastTransitionTime":"2025-10-04T02:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.624046 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.624249 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.624318 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:41:44.624279044 +0000 UTC m=+84.521237722 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.624374 4964 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.624464 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 02:41:44.624439958 +0000 UTC m=+84.521398626 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.624497 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.624680 4964 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.624740 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 02:41:44.624726395 +0000 UTC m=+84.521685073 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.661488 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.676192 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.680217 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62b
b3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42
Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.693877 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.693941 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.693958 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.693981 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.694000 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:12Z","lastTransitionTime":"2025-10-04T02:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.698069 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.715861 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d944
8ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f40185a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.725864 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.725946 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.725993 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs\") pod \"network-metrics-daemon-xrr6r\" (UID: \"7f1c9150-b444-41bb-9233-d76c4765a2d0\") " pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.726133 4964 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered 
Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.726157 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.726214 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs podName:7f1c9150-b444-41bb-9233-d76c4765a2d0 nodeName:}" failed. No retries permitted until 2025-10-04 02:41:28.726190705 +0000 UTC m=+68.623149373 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs") pod "network-metrics-daemon-xrr6r" (UID: "7f1c9150-b444-41bb-9233-d76c4765a2d0") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.726230 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.726250 4964 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.726339 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-04 02:41:44.726320258 +0000 UTC m=+84.623278926 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.726188 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.726430 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.726447 4964 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.726537 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-04 02:41:44.726522953 +0000 UTC m=+84.623481621 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.738024 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\"
:\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.768712 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:08Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:41:07.989751 6606 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1004 02:41:07.989781 6606 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1004 02:41:07.989817 6606 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1004 02:41:07.989821 6606 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1004 02:41:07.989834 6606 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1004 02:41:07.989857 6606 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1004 02:41:07.989868 6606 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1004 02:41:07.989893 6606 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1004 02:41:07.989902 6606 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1004 02:41:07.989917 6606 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1004 02:41:07.989922 6606 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1004 02:41:07.989940 6606 factory.go:656] Stopping watch factory\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.Node event handler 7\\\\nI1004 02:41:07.989972 6606 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:41:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051
342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.791836 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873
cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.797033 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.797086 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.797103 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.797127 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.797144 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:12Z","lastTransitionTime":"2025-10-04T02:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.812315 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.828222 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.844778 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.844833 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.844811 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.845025 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.845159 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:12 crc kubenswrapper[4964]: E1004 02:41:12.845237 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.851099 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.872835 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.900079 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.900162 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.900186 4964 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.900218 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.900243 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:12Z","lastTransitionTime":"2025-10-04T02:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.903597 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96
e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.920315 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:12 crc 
kubenswrapper[4964]: I1004 02:41:12.940426 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.961768 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:12 crc kubenswrapper[4964]: I1004 02:41:12.981495 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.000429 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:41:12Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.002904 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.002956 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.003002 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.003051 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.003069 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:13Z","lastTransitionTime":"2025-10-04T02:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.106346 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.106419 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.106442 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.106474 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.106496 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:13Z","lastTransitionTime":"2025-10-04T02:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.209373 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.209420 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.209432 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.209452 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.209463 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:13Z","lastTransitionTime":"2025-10-04T02:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.311705 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.311766 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.311782 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.311811 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.311830 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:13Z","lastTransitionTime":"2025-10-04T02:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.415271 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.415328 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.415348 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.415375 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.415400 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:13Z","lastTransitionTime":"2025-10-04T02:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.518080 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.518135 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.518163 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.518187 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.518204 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:13Z","lastTransitionTime":"2025-10-04T02:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.624655 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.624716 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.624733 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.624756 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.624774 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:13Z","lastTransitionTime":"2025-10-04T02:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.727776 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.727834 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.727851 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.727873 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.727887 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:13Z","lastTransitionTime":"2025-10-04T02:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.830534 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.830585 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.830601 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.830656 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.830673 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:13Z","lastTransitionTime":"2025-10-04T02:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.844526 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:13 crc kubenswrapper[4964]: E1004 02:41:13.844711 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.933763 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.933822 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.933846 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.933875 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:13 crc kubenswrapper[4964]: I1004 02:41:13.933897 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:13Z","lastTransitionTime":"2025-10-04T02:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.036748 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.036829 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.036848 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.036872 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.036888 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:14Z","lastTransitionTime":"2025-10-04T02:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.139160 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.139205 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.139217 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.139234 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.139243 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:14Z","lastTransitionTime":"2025-10-04T02:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.241909 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.241949 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.241966 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.241992 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.242008 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:14Z","lastTransitionTime":"2025-10-04T02:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.344842 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.344904 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.344922 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.344944 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.344961 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:14Z","lastTransitionTime":"2025-10-04T02:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.447476 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.447523 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.447539 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.447560 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.447575 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:14Z","lastTransitionTime":"2025-10-04T02:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.550223 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.550307 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.550332 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.550365 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.550388 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:14Z","lastTransitionTime":"2025-10-04T02:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.653503 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.653564 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.653580 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.653604 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.653653 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:14Z","lastTransitionTime":"2025-10-04T02:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.756453 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.756510 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.756527 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.756550 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.756567 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:14Z","lastTransitionTime":"2025-10-04T02:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.845286 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.845311 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:14 crc kubenswrapper[4964]: E1004 02:41:14.845485 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:14 crc kubenswrapper[4964]: E1004 02:41:14.845611 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.845691 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:14 crc kubenswrapper[4964]: E1004 02:41:14.845806 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.861200 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.861282 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.861308 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.861342 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.861377 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:14Z","lastTransitionTime":"2025-10-04T02:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.964661 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.964746 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.964768 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.964799 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:14 crc kubenswrapper[4964]: I1004 02:41:14.964820 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:14Z","lastTransitionTime":"2025-10-04T02:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.068106 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.068171 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.068188 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.068213 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.068230 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:15Z","lastTransitionTime":"2025-10-04T02:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.171732 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.171795 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.171812 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.171837 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.171883 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:15Z","lastTransitionTime":"2025-10-04T02:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.275002 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.275076 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.275093 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.275118 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.275135 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:15Z","lastTransitionTime":"2025-10-04T02:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.378098 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.378165 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.378182 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.378207 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.378228 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:15Z","lastTransitionTime":"2025-10-04T02:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.481195 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.481272 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.481291 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.481320 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.481338 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:15Z","lastTransitionTime":"2025-10-04T02:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.584379 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.584470 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.584487 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.584509 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.584528 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:15Z","lastTransitionTime":"2025-10-04T02:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.687577 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.687697 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.687716 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.687742 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.687759 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:15Z","lastTransitionTime":"2025-10-04T02:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.791893 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.791962 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.791980 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.792005 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.792022 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:15Z","lastTransitionTime":"2025-10-04T02:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.845026 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:15 crc kubenswrapper[4964]: E1004 02:41:15.845199 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.895347 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.895414 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.895432 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.895457 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.895475 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:15Z","lastTransitionTime":"2025-10-04T02:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.999096 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.999166 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.999190 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.999220 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:15 crc kubenswrapper[4964]: I1004 02:41:15.999244 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:15Z","lastTransitionTime":"2025-10-04T02:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.102546 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.102609 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.102652 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.102679 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.102696 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:16Z","lastTransitionTime":"2025-10-04T02:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.205665 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.206043 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.206244 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.206507 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.206756 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:16Z","lastTransitionTime":"2025-10-04T02:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.309872 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.309958 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.309976 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.310430 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.310481 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:16Z","lastTransitionTime":"2025-10-04T02:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.413231 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.413286 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.413310 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.413332 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.413349 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:16Z","lastTransitionTime":"2025-10-04T02:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.516428 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.516486 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.516502 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.516526 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.516544 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:16Z","lastTransitionTime":"2025-10-04T02:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.620467 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.620532 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.620552 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.620574 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.620591 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:16Z","lastTransitionTime":"2025-10-04T02:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.723948 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.724009 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.724026 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.724095 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.724119 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:16Z","lastTransitionTime":"2025-10-04T02:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.829818 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.830009 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.830029 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.830053 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.830073 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:16Z","lastTransitionTime":"2025-10-04T02:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.845219 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.845251 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.845288 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:16 crc kubenswrapper[4964]: E1004 02:41:16.845401 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:16 crc kubenswrapper[4964]: E1004 02:41:16.845510 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:16 crc kubenswrapper[4964]: E1004 02:41:16.845606 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.933072 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.933139 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.933162 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.933189 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:16 crc kubenswrapper[4964]: I1004 02:41:16.933211 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:16Z","lastTransitionTime":"2025-10-04T02:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.036871 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.036967 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.036987 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.037461 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.037544 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:17Z","lastTransitionTime":"2025-10-04T02:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.141555 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.141666 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.141693 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.141723 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.141747 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:17Z","lastTransitionTime":"2025-10-04T02:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.244846 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.244907 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.244924 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.244954 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.244972 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:17Z","lastTransitionTime":"2025-10-04T02:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.349181 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.349239 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.349257 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.349282 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.349300 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:17Z","lastTransitionTime":"2025-10-04T02:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.452538 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.452611 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.452682 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.452712 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.452734 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:17Z","lastTransitionTime":"2025-10-04T02:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.556263 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.556321 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.556337 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.556357 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.556372 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:17Z","lastTransitionTime":"2025-10-04T02:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.659373 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.659440 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.659458 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.659485 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.659505 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:17Z","lastTransitionTime":"2025-10-04T02:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.715248 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.715289 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.715298 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.715315 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.715326 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:17Z","lastTransitionTime":"2025-10-04T02:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:17 crc kubenswrapper[4964]: E1004 02:41:17.728155 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:17Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.733214 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.733246 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.733257 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.733274 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.733284 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:17Z","lastTransitionTime":"2025-10-04T02:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:17 crc kubenswrapper[4964]: E1004 02:41:17.750447 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:17Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.755212 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.755235 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.755244 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.755256 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.755264 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:17Z","lastTransitionTime":"2025-10-04T02:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:17 crc kubenswrapper[4964]: E1004 02:41:17.820411 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:17Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:17 crc kubenswrapper[4964]: E1004 02:41:17.820718 4964 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.822675 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.822739 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.822753 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.822771 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.822807 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:17Z","lastTransitionTime":"2025-10-04T02:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.845038 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:17 crc kubenswrapper[4964]: E1004 02:41:17.845219 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.925678 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.925779 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.925799 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.925878 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:17 crc kubenswrapper[4964]: I1004 02:41:17.925898 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:17Z","lastTransitionTime":"2025-10-04T02:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.029976 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.030017 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.030033 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.030055 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.030071 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:18Z","lastTransitionTime":"2025-10-04T02:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.133470 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.133533 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.133552 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.133574 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.133592 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:18Z","lastTransitionTime":"2025-10-04T02:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.237195 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.237266 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.237287 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.237315 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.237336 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:18Z","lastTransitionTime":"2025-10-04T02:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.340354 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.340404 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.340416 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.340432 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.340445 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:18Z","lastTransitionTime":"2025-10-04T02:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.443933 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.443995 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.444015 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.444040 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.444057 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:18Z","lastTransitionTime":"2025-10-04T02:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.546977 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.547046 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.547067 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.547092 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.547110 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:18Z","lastTransitionTime":"2025-10-04T02:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.649703 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.649765 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.649786 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.649869 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.649908 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:18Z","lastTransitionTime":"2025-10-04T02:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.757336 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.757390 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.757409 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.757431 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.757448 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:18Z","lastTransitionTime":"2025-10-04T02:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.844655 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:18 crc kubenswrapper[4964]: E1004 02:41:18.845174 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.844880 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.844702 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:18 crc kubenswrapper[4964]: E1004 02:41:18.846405 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:18 crc kubenswrapper[4964]: E1004 02:41:18.846598 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.860022 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.860067 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.860078 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.860094 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.860108 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:18Z","lastTransitionTime":"2025-10-04T02:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.963274 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.963346 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.963367 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.963402 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:18 crc kubenswrapper[4964]: I1004 02:41:18.963425 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:18Z","lastTransitionTime":"2025-10-04T02:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.066490 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.066539 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.066555 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.066590 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.066607 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:19Z","lastTransitionTime":"2025-10-04T02:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.169440 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.169496 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.169513 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.169537 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.169557 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:19Z","lastTransitionTime":"2025-10-04T02:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.272345 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.272413 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.272435 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.272466 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.272483 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:19Z","lastTransitionTime":"2025-10-04T02:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.375548 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.375606 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.375656 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.375680 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.375698 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:19Z","lastTransitionTime":"2025-10-04T02:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.478610 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.478701 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.478723 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.478751 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.478771 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:19Z","lastTransitionTime":"2025-10-04T02:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.581587 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.582022 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.582250 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.582442 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.582607 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:19Z","lastTransitionTime":"2025-10-04T02:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.685663 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.686097 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.686335 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.686799 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.687232 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:19Z","lastTransitionTime":"2025-10-04T02:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.790295 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.790334 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.790343 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.790359 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.790368 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:19Z","lastTransitionTime":"2025-10-04T02:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.844839 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:19 crc kubenswrapper[4964]: E1004 02:41:19.845020 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.893250 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.893309 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.893325 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.893350 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.893385 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:19Z","lastTransitionTime":"2025-10-04T02:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.996220 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.996341 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.996364 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.996391 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:19 crc kubenswrapper[4964]: I1004 02:41:19.996412 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:19Z","lastTransitionTime":"2025-10-04T02:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.099524 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.099584 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.099607 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.099667 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.099688 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:20Z","lastTransitionTime":"2025-10-04T02:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.202957 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.203006 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.203031 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.203058 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.203080 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:20Z","lastTransitionTime":"2025-10-04T02:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.305771 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.305825 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.305842 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.305865 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.305882 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:20Z","lastTransitionTime":"2025-10-04T02:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.408809 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.408872 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.408889 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.408912 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.408934 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:20Z","lastTransitionTime":"2025-10-04T02:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.511809 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.511876 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.511899 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.511927 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.511947 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:20Z","lastTransitionTime":"2025-10-04T02:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.615245 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.615301 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.615320 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.615343 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.615361 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:20Z","lastTransitionTime":"2025-10-04T02:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.718026 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.718079 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.718096 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.718117 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.718136 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:20Z","lastTransitionTime":"2025-10-04T02:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.822670 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.823088 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.823108 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.823134 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.823153 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:20Z","lastTransitionTime":"2025-10-04T02:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.844258 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.844330 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:20 crc kubenswrapper[4964]: E1004 02:41:20.844404 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.844341 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:20 crc kubenswrapper[4964]: E1004 02:41:20.844545 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:20 crc kubenswrapper[4964]: E1004 02:41:20.844765 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.864798 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\
\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wherea
bouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:20Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.879557 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:20Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:20 crc 
kubenswrapper[4964]: I1004 02:41:20.896548 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:20Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.915723 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:20Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.925305 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.925356 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.925380 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.925400 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.925412 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:20Z","lastTransitionTime":"2025-10-04T02:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.934391 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:20Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.954140 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:41:20Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.974466 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:20Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:20 crc kubenswrapper[4964]: I1004 02:41:20.991693 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c
4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:20Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.012518 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:21Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.027306 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f40185a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:21Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.028573 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.028628 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.028643 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.028662 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.028675 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:21Z","lastTransitionTime":"2025-10-04T02:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.059013 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:08Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:41:07.989751 6606 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1004 02:41:07.989781 6606 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1004 02:41:07.989817 6606 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1004 02:41:07.989821 6606 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1004 02:41:07.989834 6606 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1004 02:41:07.989857 6606 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1004 02:41:07.989868 6606 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1004 02:41:07.989893 6606 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1004 02:41:07.989902 6606 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1004 02:41:07.989917 6606 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1004 02:41:07.989922 6606 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1004 02:41:07.989940 6606 factory.go:656] Stopping watch factory\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.Node event handler 7\\\\nI1004 02:41:07.989972 6606 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:41:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051
342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:21Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.080432 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873
cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:21Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.099159 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"104ac332-90bc-4a0d-a085-0e47997cfdd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391bbf5e9b0494eb2fb5ab2d016230d876a51bd83eb52e759819dd11d54c1dc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79910f39f896467b97e336bf2d9ddc46fc68ce22e15162a78865049569c03c81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94e58235a34a6260cb38523b1823b73a5083dcff7f3af4167635bc394289072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:21Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.120356 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:21Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.130993 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.131061 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.131087 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.131119 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.131142 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:21Z","lastTransitionTime":"2025-10-04T02:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.136874 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:21Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.163717 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:21Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.183427 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:21Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.233744 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.233805 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.233823 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.233846 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.233863 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:21Z","lastTransitionTime":"2025-10-04T02:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.336884 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.336975 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.336994 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.337017 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.337035 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:21Z","lastTransitionTime":"2025-10-04T02:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.439128 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.439186 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.439204 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.439225 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.439240 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:21Z","lastTransitionTime":"2025-10-04T02:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.542190 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.542252 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.542270 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.542296 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.542332 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:21Z","lastTransitionTime":"2025-10-04T02:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.644976 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.645054 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.645077 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.645107 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.645131 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:21Z","lastTransitionTime":"2025-10-04T02:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.748794 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.748856 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.748871 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.748899 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.748916 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:21Z","lastTransitionTime":"2025-10-04T02:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.845124 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:21 crc kubenswrapper[4964]: E1004 02:41:21.845352 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.846448 4964 scope.go:117] "RemoveContainer" containerID="2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d" Oct 04 02:41:21 crc kubenswrapper[4964]: E1004 02:41:21.846841 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.851127 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.851191 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.851209 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.851238 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.851257 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:21Z","lastTransitionTime":"2025-10-04T02:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.954221 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.954280 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.954296 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.954319 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:21 crc kubenswrapper[4964]: I1004 02:41:21.954337 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:21Z","lastTransitionTime":"2025-10-04T02:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.056864 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.056948 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.056970 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.057001 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.057026 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:22Z","lastTransitionTime":"2025-10-04T02:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.160560 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.160656 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.160675 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.160699 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.160715 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:22Z","lastTransitionTime":"2025-10-04T02:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.264239 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.264293 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.264313 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.264339 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.264357 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:22Z","lastTransitionTime":"2025-10-04T02:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.367448 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.367501 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.367525 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.367553 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.367576 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:22Z","lastTransitionTime":"2025-10-04T02:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.470751 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.470827 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.470850 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.470880 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.470911 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:22Z","lastTransitionTime":"2025-10-04T02:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.573527 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.573597 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.573653 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.573685 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.573709 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:22Z","lastTransitionTime":"2025-10-04T02:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.677099 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.677155 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.677172 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.677243 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.677262 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:22Z","lastTransitionTime":"2025-10-04T02:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.781951 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.782021 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.782043 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.782087 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.782110 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:22Z","lastTransitionTime":"2025-10-04T02:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.844562 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:22 crc kubenswrapper[4964]: E1004 02:41:22.844761 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.845025 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:22 crc kubenswrapper[4964]: E1004 02:41:22.845205 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.845044 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:22 crc kubenswrapper[4964]: E1004 02:41:22.845604 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.885151 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.885210 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.885228 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.885252 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.885268 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:22Z","lastTransitionTime":"2025-10-04T02:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.987705 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.987763 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.987780 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.987806 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:22 crc kubenswrapper[4964]: I1004 02:41:22.987824 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:22Z","lastTransitionTime":"2025-10-04T02:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.090183 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.090240 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.090256 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.090280 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.090297 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:23Z","lastTransitionTime":"2025-10-04T02:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.193499 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.193542 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.193553 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.193570 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.193583 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:23Z","lastTransitionTime":"2025-10-04T02:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.296171 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.296215 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.296227 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.296244 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.296257 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:23Z","lastTransitionTime":"2025-10-04T02:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.398995 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.399051 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.399068 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.399090 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.399107 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:23Z","lastTransitionTime":"2025-10-04T02:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.501969 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.502063 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.502085 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.502142 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.502159 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:23Z","lastTransitionTime":"2025-10-04T02:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.605722 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.605820 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.605840 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.605871 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.605893 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:23Z","lastTransitionTime":"2025-10-04T02:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.708468 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.708540 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.708562 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.708592 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.708645 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:23Z","lastTransitionTime":"2025-10-04T02:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.812068 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.812136 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.812152 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.812181 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.812207 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:23Z","lastTransitionTime":"2025-10-04T02:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.844870 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:23 crc kubenswrapper[4964]: E1004 02:41:23.845043 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.915351 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.915493 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.915523 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.915552 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:23 crc kubenswrapper[4964]: I1004 02:41:23.915574 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:23Z","lastTransitionTime":"2025-10-04T02:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.018146 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.018200 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.018217 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.018239 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.018256 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:24Z","lastTransitionTime":"2025-10-04T02:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.120913 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.120983 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.121000 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.121023 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.121040 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:24Z","lastTransitionTime":"2025-10-04T02:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.223827 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.223915 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.223938 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.223968 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.223989 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:24Z","lastTransitionTime":"2025-10-04T02:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.326148 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.326230 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.326249 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.326279 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.326299 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:24Z","lastTransitionTime":"2025-10-04T02:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.428796 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.428851 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.428865 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.428884 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.428897 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:24Z","lastTransitionTime":"2025-10-04T02:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.532750 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.532810 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.532826 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.532850 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.532878 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:24Z","lastTransitionTime":"2025-10-04T02:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.636060 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.636123 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.636140 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.636164 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.636182 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:24Z","lastTransitionTime":"2025-10-04T02:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.739141 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.739202 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.739219 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.739243 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.739262 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:24Z","lastTransitionTime":"2025-10-04T02:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.842701 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.842747 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.842779 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.842803 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.842820 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:24Z","lastTransitionTime":"2025-10-04T02:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.849007 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:24 crc kubenswrapper[4964]: E1004 02:41:24.849171 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.849757 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:24 crc kubenswrapper[4964]: E1004 02:41:24.849896 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.849971 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:24 crc kubenswrapper[4964]: E1004 02:41:24.850050 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.947193 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.947269 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.947284 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.947306 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:24 crc kubenswrapper[4964]: I1004 02:41:24.947323 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:24Z","lastTransitionTime":"2025-10-04T02:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.050206 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.050283 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.050301 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.050327 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.050346 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:25Z","lastTransitionTime":"2025-10-04T02:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.154283 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.154346 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.154357 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.154379 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.154395 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:25Z","lastTransitionTime":"2025-10-04T02:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.258143 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.258203 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.258220 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.258244 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.258261 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:25Z","lastTransitionTime":"2025-10-04T02:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.360935 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.360993 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.361009 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.361033 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.361054 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:25Z","lastTransitionTime":"2025-10-04T02:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.463854 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.463891 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.463904 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.463921 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.463933 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:25Z","lastTransitionTime":"2025-10-04T02:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.567413 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.567472 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.567483 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.567503 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.567516 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:25Z","lastTransitionTime":"2025-10-04T02:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.670102 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.670149 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.670157 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.670171 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.670182 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:25Z","lastTransitionTime":"2025-10-04T02:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.773602 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.773657 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.773666 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.773683 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.773693 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:25Z","lastTransitionTime":"2025-10-04T02:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.844552 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:25 crc kubenswrapper[4964]: E1004 02:41:25.844753 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.876004 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.876067 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.876077 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.876118 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.876128 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:25Z","lastTransitionTime":"2025-10-04T02:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.978788 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.978860 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.978878 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.978902 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:25 crc kubenswrapper[4964]: I1004 02:41:25.978921 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:25Z","lastTransitionTime":"2025-10-04T02:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.081522 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.081563 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.081623 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.081642 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.081654 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:26Z","lastTransitionTime":"2025-10-04T02:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.184457 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.184529 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.184547 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.184574 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.184592 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:26Z","lastTransitionTime":"2025-10-04T02:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.287460 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.287508 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.287521 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.287536 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.287545 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:26Z","lastTransitionTime":"2025-10-04T02:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.389426 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.389482 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.389497 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.389520 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.389537 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:26Z","lastTransitionTime":"2025-10-04T02:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.492939 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.492980 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.492989 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.493004 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.493018 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:26Z","lastTransitionTime":"2025-10-04T02:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.596522 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.596609 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.596674 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.596707 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.596727 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:26Z","lastTransitionTime":"2025-10-04T02:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.700249 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.700376 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.700404 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.700437 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.700461 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:26Z","lastTransitionTime":"2025-10-04T02:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.803401 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.803936 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.804037 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.804141 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.804241 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:26Z","lastTransitionTime":"2025-10-04T02:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.844924 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.845078 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.844950 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:26 crc kubenswrapper[4964]: E1004 02:41:26.845191 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:26 crc kubenswrapper[4964]: E1004 02:41:26.845302 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:26 crc kubenswrapper[4964]: E1004 02:41:26.845504 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.907304 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.907355 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.907368 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.907387 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:26 crc kubenswrapper[4964]: I1004 02:41:26.907400 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:26Z","lastTransitionTime":"2025-10-04T02:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.010467 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.010523 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.010537 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.010562 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.010579 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:27Z","lastTransitionTime":"2025-10-04T02:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.113711 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.113770 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.113787 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.113815 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.113832 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:27Z","lastTransitionTime":"2025-10-04T02:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.217064 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.217126 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.217145 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.217169 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.217189 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:27Z","lastTransitionTime":"2025-10-04T02:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.320041 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.320094 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.320115 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.320143 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.320167 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:27Z","lastTransitionTime":"2025-10-04T02:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.423185 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.423262 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.423285 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.423317 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.423341 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:27Z","lastTransitionTime":"2025-10-04T02:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.526339 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.526407 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.526427 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.526451 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.526468 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:27Z","lastTransitionTime":"2025-10-04T02:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.628904 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.628974 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.628993 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.629159 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.629379 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:27Z","lastTransitionTime":"2025-10-04T02:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.732755 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.732820 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.732833 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.732853 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.732868 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:27Z","lastTransitionTime":"2025-10-04T02:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.835755 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.835821 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.835838 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.835863 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.835880 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:27Z","lastTransitionTime":"2025-10-04T02:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.845035 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:27 crc kubenswrapper[4964]: E1004 02:41:27.845162 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.938180 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.938230 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.938243 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.938260 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:27 crc kubenswrapper[4964]: I1004 02:41:27.938273 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:27Z","lastTransitionTime":"2025-10-04T02:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.041658 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.041731 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.041780 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.041808 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.041826 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:28Z","lastTransitionTime":"2025-10-04T02:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.102776 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.102838 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.102856 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.102879 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.102899 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:28Z","lastTransitionTime":"2025-10-04T02:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:28 crc kubenswrapper[4964]: E1004 02:41:28.120111 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:28Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.124312 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.124362 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.124378 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.124400 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.124416 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:28Z","lastTransitionTime":"2025-10-04T02:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:28 crc kubenswrapper[4964]: E1004 02:41:28.138070 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:28Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.144023 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.144094 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.144118 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.144142 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.144158 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:28Z","lastTransitionTime":"2025-10-04T02:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:28 crc kubenswrapper[4964]: E1004 02:41:28.164525 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:28Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.169089 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.169146 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.169163 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.169186 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.169203 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:28Z","lastTransitionTime":"2025-10-04T02:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:28 crc kubenswrapper[4964]: E1004 02:41:28.181179 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:28Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.184806 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.184879 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.184898 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.184922 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.184940 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:28Z","lastTransitionTime":"2025-10-04T02:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:28 crc kubenswrapper[4964]: E1004 02:41:28.200721 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:28Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:28 crc kubenswrapper[4964]: E1004 02:41:28.200838 4964 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.202578 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.202604 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.202626 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.202640 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.202651 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:28Z","lastTransitionTime":"2025-10-04T02:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.305462 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.305534 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.305548 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.305568 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.305581 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:28Z","lastTransitionTime":"2025-10-04T02:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.407027 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.407068 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.407080 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.407098 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.407108 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:28Z","lastTransitionTime":"2025-10-04T02:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.509020 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.509057 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.509065 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.509081 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.509090 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:28Z","lastTransitionTime":"2025-10-04T02:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.611332 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.611371 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.611380 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.611394 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.611403 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:28Z","lastTransitionTime":"2025-10-04T02:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.714037 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.714071 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.714078 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.714092 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.714101 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:28Z","lastTransitionTime":"2025-10-04T02:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.807881 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs\") pod \"network-metrics-daemon-xrr6r\" (UID: \"7f1c9150-b444-41bb-9233-d76c4765a2d0\") " pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:28 crc kubenswrapper[4964]: E1004 02:41:28.808115 4964 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 02:41:28 crc kubenswrapper[4964]: E1004 02:41:28.808239 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs podName:7f1c9150-b444-41bb-9233-d76c4765a2d0 nodeName:}" failed. No retries permitted until 2025-10-04 02:42:00.808207711 +0000 UTC m=+100.705166379 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs") pod "network-metrics-daemon-xrr6r" (UID: "7f1c9150-b444-41bb-9233-d76c4765a2d0") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.816786 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.816843 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.816860 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.816888 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.816906 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:28Z","lastTransitionTime":"2025-10-04T02:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.844494 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.844564 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.844519 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:28 crc kubenswrapper[4964]: E1004 02:41:28.844742 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:28 crc kubenswrapper[4964]: E1004 02:41:28.845185 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:28 crc kubenswrapper[4964]: E1004 02:41:28.845291 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.918975 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.919034 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.919050 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.919073 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:28 crc kubenswrapper[4964]: I1004 02:41:28.919090 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:28Z","lastTransitionTime":"2025-10-04T02:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.022090 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.022124 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.022132 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.022146 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.022155 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:29Z","lastTransitionTime":"2025-10-04T02:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.124316 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.124371 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.124389 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.124411 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.124428 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:29Z","lastTransitionTime":"2025-10-04T02:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.227003 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.227090 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.227101 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.227118 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.227129 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:29Z","lastTransitionTime":"2025-10-04T02:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.329280 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.329345 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.329364 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.329389 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.329410 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:29Z","lastTransitionTime":"2025-10-04T02:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.431644 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.431684 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.431700 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.431719 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.431735 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:29Z","lastTransitionTime":"2025-10-04T02:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.533477 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.533549 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.533569 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.533594 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.533611 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:29Z","lastTransitionTime":"2025-10-04T02:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.636262 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.636298 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.636310 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.636325 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.636339 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:29Z","lastTransitionTime":"2025-10-04T02:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.738688 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.738736 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.738752 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.738774 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.738790 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:29Z","lastTransitionTime":"2025-10-04T02:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.841325 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.841357 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.841368 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.841385 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.841395 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:29Z","lastTransitionTime":"2025-10-04T02:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.845102 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:29 crc kubenswrapper[4964]: E1004 02:41:29.845371 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.856202 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.943288 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.943347 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.943362 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.943386 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:29 crc kubenswrapper[4964]: I1004 02:41:29.943404 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:29Z","lastTransitionTime":"2025-10-04T02:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.045069 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.045136 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.045153 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.045183 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.045200 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:30Z","lastTransitionTime":"2025-10-04T02:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.147155 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.147214 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.147233 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.147257 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.147277 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:30Z","lastTransitionTime":"2025-10-04T02:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.250258 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.250329 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.250346 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.250372 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.250388 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:30Z","lastTransitionTime":"2025-10-04T02:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.353031 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.353098 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.353114 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.353145 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.353165 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:30Z","lastTransitionTime":"2025-10-04T02:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.376012 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q6hm8_10ea848d-0322-476d-976d-4ae3ac39910b/kube-multus/0.log" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.376077 4964 generic.go:334] "Generic (PLEG): container finished" podID="10ea848d-0322-476d-976d-4ae3ac39910b" containerID="b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709" exitCode=1 Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.376768 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q6hm8" event={"ID":"10ea848d-0322-476d-976d-4ae3ac39910b","Type":"ContainerDied","Data":"b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709"} Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.377146 4964 scope.go:117] "RemoveContainer" containerID="b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.392594 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c
4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.405142 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.424198 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f40185a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.437343 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.452407 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:29Z\\\",\\\"message\\\":\\\"2025-10-04T02:40:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b\\\\n2025-10-04T02:40:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b to /host/opt/cni/bin/\\\\n2025-10-04T02:40:44Z [verbose] multus-daemon started\\\\n2025-10-04T02:40:44Z [verbose] Readiness Indicator file check\\\\n2025-10-04T02:41:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.456300 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.456343 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.456359 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.456382 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.456399 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:30Z","lastTransitionTime":"2025-10-04T02:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.470871 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:08Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:41:07.989751 6606 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1004 02:41:07.989781 6606 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1004 02:41:07.989817 6606 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1004 02:41:07.989821 6606 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1004 02:41:07.989834 6606 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1004 02:41:07.989857 6606 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1004 02:41:07.989868 6606 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1004 02:41:07.989893 6606 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1004 02:41:07.989902 6606 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1004 02:41:07.989917 6606 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1004 02:41:07.989922 6606 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1004 02:41:07.989940 6606 factory.go:656] Stopping watch factory\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.Node event handler 7\\\\nI1004 02:41:07.989972 6606 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:41:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051
342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.486337 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873
cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.506721 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"104ac332-90bc-4a0d-a085-0e47997cfdd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391bbf5e9b0494eb2fb5ab2d016230d876a51bd83eb52e759819dd11d54c1dc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79910f39f896467b97e336bf2d9ddc46fc68ce22e15162a78865049569c03c81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94e58235a34a6260cb38523b1823b73a5083dcff7f3af4167635bc394289072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.520146 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2d0d43-54d5-4b73-a227-0867264eee27\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87fdd14ec19d65c508672f966db71a585458cbea0a944a6960b56d8e0160eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.539760 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.553281 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.558777 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.558839 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.558861 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.558891 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.558913 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:30Z","lastTransitionTime":"2025-10-04T02:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.567663 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.580120 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.602349 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.612545 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc 
kubenswrapper[4964]: I1004 02:41:30.624963 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.637847 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.651673 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.661385 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.661409 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.661416 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.661429 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.661438 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:30Z","lastTransitionTime":"2025-10-04T02:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.763110 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.763157 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.763173 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.763193 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.763211 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:30Z","lastTransitionTime":"2025-10-04T02:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.845214 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.845267 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.845319 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:30 crc kubenswrapper[4964]: E1004 02:41:30.845390 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:30 crc kubenswrapper[4964]: E1004 02:41:30.845507 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:30 crc kubenswrapper[4964]: E1004 02:41:30.845663 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.862728 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.865778 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:30 crc 
kubenswrapper[4964]: I1004 02:41:30.865848 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.865872 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.865900 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.865921 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:30Z","lastTransitionTime":"2025-10-04T02:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.873856 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.890043 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f40185a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.907851 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:29Z\\\",\\\"message\\\":\\\"2025-10-04T02:40:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b\\\\n2025-10-04T02:40:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b to /host/opt/cni/bin/\\\\n2025-10-04T02:40:44Z [verbose] multus-daemon started\\\\n2025-10-04T02:40:44Z [verbose] Readiness Indicator file check\\\\n2025-10-04T02:41:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.938364 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:08Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:41:07.989751 6606 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1004 02:41:07.989781 6606 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1004 02:41:07.989817 6606 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1004 02:41:07.989821 6606 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1004 02:41:07.989834 6606 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1004 02:41:07.989857 6606 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1004 02:41:07.989868 6606 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1004 02:41:07.989893 6606 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1004 02:41:07.989902 6606 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1004 02:41:07.989917 6606 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1004 02:41:07.989922 6606 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1004 02:41:07.989940 6606 factory.go:656] Stopping watch factory\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.Node event handler 7\\\\nI1004 02:41:07.989972 6606 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:41:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051
342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.954807 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873
cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.968002 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.968058 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.968075 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.968099 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.968118 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:30Z","lastTransitionTime":"2025-10-04T02:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.972247 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"104ac332-90bc-4a0d-a085-0e47997cfdd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391bbf5e9b0494eb2fb5ab2d016230d876a51bd83eb52e759819dd11d54c1dc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79910f39f896467b97e336bf2d9ddc
46fc68ce22e15162a78865049569c03c81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94e58235a34a6260cb38523b1823b73a5083dcff7f3af4167635bc394289072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:30 crc kubenswrapper[4964]: I1004 02:41:30.983327 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2d0d43-54d5-4b73-a227-0867264eee27\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87fdd14ec19d65c508672f966db71a585458cbea0a944a6960b56d8e0160eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.001064 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:30Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.013485 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.031363 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.043148 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.064124 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.071642 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.071719 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.071737 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.071758 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.073751 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:31Z","lastTransitionTime":"2025-10-04T02:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.080472 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc 
kubenswrapper[4964]: I1004 02:41:31.099192 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.116100 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.134049 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.150914 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.177214 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.177253 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.177263 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.177278 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.177290 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:31Z","lastTransitionTime":"2025-10-04T02:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.279957 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.280043 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.280061 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.280597 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.280694 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:31Z","lastTransitionTime":"2025-10-04T02:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.384725 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.384778 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.384796 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.384819 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.384837 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:31Z","lastTransitionTime":"2025-10-04T02:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.385485 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q6hm8_10ea848d-0322-476d-976d-4ae3ac39910b/kube-multus/0.log" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.385554 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q6hm8" event={"ID":"10ea848d-0322-476d-976d-4ae3ac39910b","Type":"ContainerStarted","Data":"a79e79fc7fe0ec6d21d20e93ea04d085d1d23ee8bf1a2c766f100bf1d53b804d"} Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.403149 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e79fc7fe0ec6d21d20e93ea04d085d1d23ee8bf1a2c766f100bf1d53b804d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:29Z\\\",\\\"message\\\":\\\"2025-10-04T02:40:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b\\\\n2025-10-04T02:40:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b to /host/opt/cni/bin/\\\\n2025-10-04T02:40:44Z [verbose] multus-daemon started\\\\n2025-10-04T02:40:44Z [verbose] Readiness Indicator file check\\\\n2025-10-04T02:41:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\
",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.427825 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:08Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:41:07.989751 6606 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1004 02:41:07.989781 6606 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1004 02:41:07.989817 6606 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1004 02:41:07.989821 6606 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1004 02:41:07.989834 6606 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1004 02:41:07.989857 6606 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1004 02:41:07.989868 6606 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1004 02:41:07.989893 6606 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1004 02:41:07.989902 6606 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1004 02:41:07.989917 6606 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1004 02:41:07.989922 6606 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1004 02:41:07.989940 6606 factory.go:656] Stopping watch factory\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.Node event handler 7\\\\nI1004 02:41:07.989972 6606 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:41:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051
342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.446924 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873
cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.466465 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"104ac332-90bc-4a0d-a085-0e47997cfdd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391bbf5e9b0494eb2fb5ab2d016230d876a51bd83eb52e759819dd11d54c1dc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79910f39f896467b97e336bf2d9ddc46fc68ce22e15162a78865049569c03c81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94e58235a34a6260cb38523b1823b73a5083dcff7f3af4167635bc394289072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.484448 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2d0d43-54d5-4b73-a227-0867264eee27\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87fdd14ec19d65c508672f966db71a585458cbea0a944a6960b56d8e0160eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.487535 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.487596 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.487643 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.487671 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.487709 4964 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:31Z","lastTransitionTime":"2025-10-04T02:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.506254 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.522999 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.542046 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.559767 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.583012 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.590553 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.590610 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.590669 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.590702 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.590725 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:31Z","lastTransitionTime":"2025-10-04T02:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.598146 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc 
kubenswrapper[4964]: I1004 02:41:31.616412 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.639016 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.658206 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.675287 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.692700 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.693702 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 
02:41:31.693750 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.693772 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.693798 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.693818 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:31Z","lastTransitionTime":"2025-10-04T02:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.705893 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.722651 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f40185a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:31Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.797016 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.797061 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.797071 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.797088 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.797100 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:31Z","lastTransitionTime":"2025-10-04T02:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.844871 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:31 crc kubenswrapper[4964]: E1004 02:41:31.845104 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.900656 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.900735 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.900758 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.900783 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:31 crc kubenswrapper[4964]: I1004 02:41:31.900802 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:31Z","lastTransitionTime":"2025-10-04T02:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.003452 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.003515 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.003536 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.003559 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.003580 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:32Z","lastTransitionTime":"2025-10-04T02:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.110392 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.110662 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.110734 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.110770 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.110806 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:32Z","lastTransitionTime":"2025-10-04T02:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.214125 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.214185 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.214201 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.214227 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.214248 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:32Z","lastTransitionTime":"2025-10-04T02:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.317286 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.317398 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.317418 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.317442 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.317461 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:32Z","lastTransitionTime":"2025-10-04T02:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.420084 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.420130 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.420140 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.420155 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.420165 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:32Z","lastTransitionTime":"2025-10-04T02:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.523394 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.523435 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.523444 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.523459 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.523468 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:32Z","lastTransitionTime":"2025-10-04T02:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.626500 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.626560 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.626577 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.626601 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.626652 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:32Z","lastTransitionTime":"2025-10-04T02:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.729970 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.730075 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.730102 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.730133 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.730155 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:32Z","lastTransitionTime":"2025-10-04T02:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.832704 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.832756 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.832774 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.832801 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.832818 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:32Z","lastTransitionTime":"2025-10-04T02:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.845087 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.845137 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.845102 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:32 crc kubenswrapper[4964]: E1004 02:41:32.845281 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:32 crc kubenswrapper[4964]: E1004 02:41:32.845362 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:32 crc kubenswrapper[4964]: E1004 02:41:32.845432 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.935763 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.935804 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.935812 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.935827 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:32 crc kubenswrapper[4964]: I1004 02:41:32.935840 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:32Z","lastTransitionTime":"2025-10-04T02:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.038892 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.038936 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.038945 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.038960 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.038971 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:33Z","lastTransitionTime":"2025-10-04T02:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.141493 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.141553 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.141571 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.141593 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.141611 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:33Z","lastTransitionTime":"2025-10-04T02:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.244115 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.244171 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.244187 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.244209 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.244233 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:33Z","lastTransitionTime":"2025-10-04T02:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.347200 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.347264 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.347279 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.347295 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.347307 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:33Z","lastTransitionTime":"2025-10-04T02:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.449446 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.449512 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.449533 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.449563 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.449583 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:33Z","lastTransitionTime":"2025-10-04T02:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.552170 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.552235 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.552249 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.552269 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.552284 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:33Z","lastTransitionTime":"2025-10-04T02:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.654844 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.654897 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.654914 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.654939 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.654957 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:33Z","lastTransitionTime":"2025-10-04T02:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.757576 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.757680 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.757702 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.757727 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.757743 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:33Z","lastTransitionTime":"2025-10-04T02:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.844464 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:33 crc kubenswrapper[4964]: E1004 02:41:33.844596 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.860479 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.860533 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.860551 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.860575 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.860594 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:33Z","lastTransitionTime":"2025-10-04T02:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.963380 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.963440 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.963459 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.963481 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:33 crc kubenswrapper[4964]: I1004 02:41:33.963498 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:33Z","lastTransitionTime":"2025-10-04T02:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.066303 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.066379 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.066398 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.066423 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.066441 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:34Z","lastTransitionTime":"2025-10-04T02:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.168851 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.168890 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.168899 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.168913 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.168925 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:34Z","lastTransitionTime":"2025-10-04T02:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.272142 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.272198 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.272217 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.272244 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.272262 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:34Z","lastTransitionTime":"2025-10-04T02:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.374892 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.374937 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.374947 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.374963 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.374978 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:34Z","lastTransitionTime":"2025-10-04T02:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.477520 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.477583 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.477604 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.477657 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.477678 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:34Z","lastTransitionTime":"2025-10-04T02:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.580156 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.580195 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.580203 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.580219 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.580230 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:34Z","lastTransitionTime":"2025-10-04T02:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.683440 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.683530 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.683550 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.683576 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.683594 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:34Z","lastTransitionTime":"2025-10-04T02:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.786659 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.786747 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.786779 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.786813 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.786838 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:34Z","lastTransitionTime":"2025-10-04T02:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.844583 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.844652 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:34 crc kubenswrapper[4964]: E1004 02:41:34.844792 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:34 crc kubenswrapper[4964]: E1004 02:41:34.845000 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.845088 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:34 crc kubenswrapper[4964]: E1004 02:41:34.845259 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.889978 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.890056 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.890074 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.890100 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.890123 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:34Z","lastTransitionTime":"2025-10-04T02:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.997358 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.997393 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.997402 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.997418 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:34 crc kubenswrapper[4964]: I1004 02:41:34.997428 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:34Z","lastTransitionTime":"2025-10-04T02:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.100235 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.100341 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.100360 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.100423 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.100441 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:35Z","lastTransitionTime":"2025-10-04T02:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.203478 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.203531 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.203550 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.203575 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.203594 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:35Z","lastTransitionTime":"2025-10-04T02:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.306719 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.306801 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.306827 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.306853 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.306871 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:35Z","lastTransitionTime":"2025-10-04T02:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.409261 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.409326 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.409344 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.409375 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.409397 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:35Z","lastTransitionTime":"2025-10-04T02:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.511596 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.511666 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.511683 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.511705 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.511721 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:35Z","lastTransitionTime":"2025-10-04T02:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.615341 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.615428 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.615447 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.615470 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.615519 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:35Z","lastTransitionTime":"2025-10-04T02:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.718386 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.718424 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.718435 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.718451 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.718462 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:35Z","lastTransitionTime":"2025-10-04T02:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.821682 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.821770 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.821792 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.821819 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.821837 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:35Z","lastTransitionTime":"2025-10-04T02:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.845386 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:35 crc kubenswrapper[4964]: E1004 02:41:35.845607 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.847232 4964 scope.go:117] "RemoveContainer" containerID="2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.960641 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.960694 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.960711 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.960732 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:35 crc kubenswrapper[4964]: I1004 02:41:35.960751 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:35Z","lastTransitionTime":"2025-10-04T02:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.063382 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.063432 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.063448 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.063470 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.063488 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:36Z","lastTransitionTime":"2025-10-04T02:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.166808 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.166851 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.166863 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.166882 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.166894 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:36Z","lastTransitionTime":"2025-10-04T02:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.270678 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.270725 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.270737 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.270754 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.270766 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:36Z","lastTransitionTime":"2025-10-04T02:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.374075 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.374136 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.374152 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.374176 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.374194 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:36Z","lastTransitionTime":"2025-10-04T02:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.402885 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovnkube-controller/2.log" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.406661 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerStarted","Data":"fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b"} Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.407114 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.431732 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015b
d013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.447333 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.458842 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f4018
5a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.477374 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.477418 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.477433 4964 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.477451 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.477466 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:36Z","lastTransitionTime":"2025-10-04T02:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.485589 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b15
4edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is 
after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.501384 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.516339 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e79fc7fe0ec6d21d20e93ea04d085d1d23ee8bf1a2c766f100bf1d53b804d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:29Z\\\",\\\"message\\\":\\\"2025-10-04T02:40:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b\\\\n2025-10-04T02:40:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b to /host/opt/cni/bin/\\\\n2025-10-04T02:40:44Z [verbose] multus-daemon started\\\\n2025-10-04T02:40:44Z [verbose] Readiness Indicator file check\\\\n2025-10-04T02:41:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.536079 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:08Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:41:07.989751 6606 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1004 02:41:07.989781 6606 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1004 02:41:07.989817 6606 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1004 02:41:07.989821 6606 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1004 02:41:07.989834 6606 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1004 02:41:07.989857 6606 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1004 02:41:07.989868 6606 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1004 02:41:07.989893 6606 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1004 02:41:07.989902 6606 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1004 02:41:07.989917 6606 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1004 02:41:07.989922 6606 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1004 02:41:07.989940 6606 factory.go:656] Stopping watch factory\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.Node event handler 7\\\\nI1004 02:41:07.989972 6606 ovnkube.go:599] Stopped 
ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:41:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.548891 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873
cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.561577 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"104ac332-90bc-4a0d-a085-0e47997cfdd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391bbf5e9b0494eb2fb5ab2d016230d876a51bd83eb52e759819dd11d54c1dc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79910f39f896467b97e336bf2d9ddc46fc68ce22e15162a78865049569c03c81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94e58235a34a6260cb38523b1823b73a5083dcff7f3af4167635bc394289072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.573024 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2d0d43-54d5-4b73-a227-0867264eee27\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87fdd14ec19d65c508672f966db71a585458cbea0a944a6960b56d8e0160eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.579979 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.580012 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.580021 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.580036 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.580046 4964 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:36Z","lastTransitionTime":"2025-10-04T02:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.589633 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.601508 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.612066 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.623003 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.635734 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.644193 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc 
kubenswrapper[4964]: I1004 02:41:36.654675 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.666057 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:36Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.681901 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.681934 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.681942 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.681956 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.681965 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:36Z","lastTransitionTime":"2025-10-04T02:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.784944 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.784981 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.784989 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.785002 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.785011 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:36Z","lastTransitionTime":"2025-10-04T02:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.845031 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.845061 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:36 crc kubenswrapper[4964]: E1004 02:41:36.845152 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.845185 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:36 crc kubenswrapper[4964]: E1004 02:41:36.845282 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:36 crc kubenswrapper[4964]: E1004 02:41:36.845520 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.887283 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.887341 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.887366 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.887397 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.887420 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:36Z","lastTransitionTime":"2025-10-04T02:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.990392 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.990462 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.990484 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.990514 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:36 crc kubenswrapper[4964]: I1004 02:41:36.990536 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:36Z","lastTransitionTime":"2025-10-04T02:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.093354 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.093420 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.093458 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.093527 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.093553 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:37Z","lastTransitionTime":"2025-10-04T02:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.197086 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.197152 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.197174 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.197201 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.197222 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:37Z","lastTransitionTime":"2025-10-04T02:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.299688 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.299753 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.299774 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.299799 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.299819 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:37Z","lastTransitionTime":"2025-10-04T02:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.403417 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.403480 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.403498 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.403527 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.403553 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:37Z","lastTransitionTime":"2025-10-04T02:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.413451 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovnkube-controller/3.log" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.414558 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovnkube-controller/2.log" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.418691 4964 generic.go:334] "Generic (PLEG): container finished" podID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerID="fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b" exitCode=1 Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.418794 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerDied","Data":"fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b"} Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.418895 4964 scope.go:117] "RemoveContainer" containerID="2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.419868 4964 scope.go:117] "RemoveContainer" containerID="fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b" Oct 04 02:41:37 crc kubenswrapper[4964]: E1004 02:41:37.420140 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.447795 4964 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab3
63c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.467852 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"104ac332-90bc-4a0d-a085-0e47997cfdd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391bbf5e9b0494eb2fb5ab2d016230d876a51bd83eb52e759819dd11d54c1dc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79910f39f896467b97e336bf2d9ddc46fc68ce22e15162a78865049569c03c81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94e58235a34a6260cb38523b1823b73a5083dcff7f3af4167635bc394289072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.486792 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2d0d43-54d5-4b73-a227-0867264eee27\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87fdd14ec19d65c508672f966db71a585458cbea0a944a6960b56d8e0160eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.507127 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.507189 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.507206 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.507232 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.507252 4964 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:37Z","lastTransitionTime":"2025-10-04T02:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.509795 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.534682 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.556169 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e79fc7fe0ec6d21d20e93ea04d085d1d23ee8bf1a2c766f100bf1d53b804d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:29Z\\\",\\\"message\\\":\\\"2025-10-04T02:40:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b\\\\n2025-10-04T02:40:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b to /host/opt/cni/bin/\\\\n2025-10-04T02:40:44Z [verbose] multus-daemon started\\\\n2025-10-04T02:40:44Z [verbose] Readiness Indicator file check\\\\n2025-10-04T02:41:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.591127 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e910e0d7fe672fdf6d10cb6fa5b04565bc43828f263d686fe1771eb71373c8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:08Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 02:41:07.989751 6606 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1004 02:41:07.989781 6606 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1004 02:41:07.989817 6606 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1004 02:41:07.989821 6606 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1004 02:41:07.989834 6606 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1004 02:41:07.989857 6606 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1004 02:41:07.989868 6606 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1004 02:41:07.989893 6606 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1004 02:41:07.989902 6606 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1004 02:41:07.989917 6606 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1004 02:41:07.989922 6606 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1004 02:41:07.989940 6606 factory.go:656] Stopping watch factory\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1004 02:41:07.989954 6606 handler.go:208] Removed *v1.Node event handler 7\\\\nI1004 02:41:07.989972 6606 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:41:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:36Z\\\",\\\"message\\\":\\\"0:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1004 02:41:36.806362 6957 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-machine-webhook for network=default\\\\nI1004 02:41:36.804867 6957 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1004 02:41:36.806385 6957 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1004 02:41:36.806008 6957 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-c
ni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.607103 4964 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.610747 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.610816 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.610834 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.610859 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.610877 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:37Z","lastTransitionTime":"2025-10-04T02:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.623390 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc 
kubenswrapper[4964]: I1004 02:41:37.642222 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.665548 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.686922 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.706971 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.714691 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.714753 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.714777 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.714807 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.714824 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:37Z","lastTransitionTime":"2025-10-04T02:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.725900 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.750312 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.767561 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: 
I1004 02:41:37.779697 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.799193 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f40185a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:37Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.817476 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.817540 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.817558 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.817584 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.817605 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:37Z","lastTransitionTime":"2025-10-04T02:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.844998 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:37 crc kubenswrapper[4964]: E1004 02:41:37.845207 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.920884 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.920957 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.920981 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.921011 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:37 crc kubenswrapper[4964]: I1004 02:41:37.921033 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:37Z","lastTransitionTime":"2025-10-04T02:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.024371 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.024433 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.024454 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.024478 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.024498 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:38Z","lastTransitionTime":"2025-10-04T02:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.127902 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.127979 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.127997 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.128027 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.128045 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:38Z","lastTransitionTime":"2025-10-04T02:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.231400 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.231464 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.231482 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.231506 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.231523 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:38Z","lastTransitionTime":"2025-10-04T02:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.334553 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.334609 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.334697 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.334725 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.334773 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:38Z","lastTransitionTime":"2025-10-04T02:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.425306 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovnkube-controller/3.log" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.430659 4964 scope.go:117] "RemoveContainer" containerID="fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b" Oct 04 02:41:38 crc kubenswrapper[4964]: E1004 02:41:38.430937 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.437084 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.437167 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.437193 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.437224 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.437247 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:38Z","lastTransitionTime":"2025-10-04T02:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.451879 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.472187 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.491529 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.513464 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.529327 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.529490 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.529510 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.529534 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.529551 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:38Z","lastTransitionTime":"2025-10-04T02:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.536897 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: E1004 02:41:38.552487 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.556342 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc 
kubenswrapper[4964]: I1004 02:41:38.558361 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.558415 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.558432 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.558456 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.558475 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:38Z","lastTransitionTime":"2025-10-04T02:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.576202 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb6762
8a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: E1004 02:41:38.579809 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.586207 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.586256 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.586277 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.586303 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.586320 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:38Z","lastTransitionTime":"2025-10-04T02:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.598509 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: E1004 02:41:38.608417 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.613474 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.613711 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.613873 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.614016 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.614144 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:38Z","lastTransitionTime":"2025-10-04T02:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.616189 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f40185a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: E1004 02:41:38.637401 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.639814 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" 
Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.644606 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.644685 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.644702 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.644725 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.644743 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:38Z","lastTransitionTime":"2025-10-04T02:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.658494 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"104ac332-90bc-4a0d-a085-0e47997cfdd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391bbf5e9b0494eb2fb5ab2d016230d876a51bd83eb52e759819dd11d54c1dc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79910f39f896467b97e336bf2d9ddc
46fc68ce22e15162a78865049569c03c81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94e58235a34a6260cb38523b1823b73a5083dcff7f3af4167635bc394289072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: E1004 02:41:38.665500 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: E1004 02:41:38.665854 4964 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.668285 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.668375 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.668395 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.668416 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.668431 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:38Z","lastTransitionTime":"2025-10-04T02:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.676436 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2d0d43-54d5-4b73-a227-0867264eee27\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87fdd14ec19d65c508672f966db71a585458cbea0a944a6960b56d8e0160eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.697880 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.713719 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.733959 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e79fc7fe0ec6d21d20e93ea04d085d1d23ee8bf1a2c766f100bf1d53b804d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:29Z\\\",\\\"message\\\":\\\"2025-10-04T02:40:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b\\\\n2025-10-04T02:40:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b to /host/opt/cni/bin/\\\\n2025-10-04T02:40:44Z [verbose] multus-daemon started\\\\n2025-10-04T02:40:44Z [verbose] Readiness Indicator file check\\\\n2025-10-04T02:41:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.766277 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:36Z\\\",\\\"message\\\":\\\"0:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1004 02:41:36.806362 6957 services_controller.go:356] Processing sync for service 
openshift-machine-api/machine-api-operator-machine-webhook for network=default\\\\nI1004 02:41:36.804867 6957 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1004 02:41:36.806385 6957 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1004 02:41:36.806008 6957 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:41:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051
342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.771483 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.771527 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.771543 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.771566 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.771584 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:38Z","lastTransitionTime":"2025-10-04T02:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.788518 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0
beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.808169 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:38Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.844241 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.844332 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:38 crc kubenswrapper[4964]: E1004 02:41:38.844508 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.844831 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:38 crc kubenswrapper[4964]: E1004 02:41:38.844851 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:38 crc kubenswrapper[4964]: E1004 02:41:38.845071 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.874727 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.874843 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.874862 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.874887 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.874904 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:38Z","lastTransitionTime":"2025-10-04T02:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.978203 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.978284 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.978303 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.978328 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:38 crc kubenswrapper[4964]: I1004 02:41:38.978347 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:38Z","lastTransitionTime":"2025-10-04T02:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.081524 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.081670 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.081690 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.081714 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.081730 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:39Z","lastTransitionTime":"2025-10-04T02:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.185140 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.185196 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.185218 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.185247 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.185270 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:39Z","lastTransitionTime":"2025-10-04T02:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.288020 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.288085 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.288102 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.288128 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.288146 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:39Z","lastTransitionTime":"2025-10-04T02:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.391321 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.391411 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.391439 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.391468 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.391489 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:39Z","lastTransitionTime":"2025-10-04T02:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.494498 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.494562 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.494579 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.494605 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.494650 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:39Z","lastTransitionTime":"2025-10-04T02:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.597768 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.597830 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.597851 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.597875 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.597892 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:39Z","lastTransitionTime":"2025-10-04T02:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.701500 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.701559 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.701581 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.701607 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.701654 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:39Z","lastTransitionTime":"2025-10-04T02:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.805251 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.805298 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.805314 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.805337 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.805353 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:39Z","lastTransitionTime":"2025-10-04T02:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.844814 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:39 crc kubenswrapper[4964]: E1004 02:41:39.845013 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.908499 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.908561 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.908578 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.908605 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:39 crc kubenswrapper[4964]: I1004 02:41:39.908668 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:39Z","lastTransitionTime":"2025-10-04T02:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.011319 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.011370 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.011386 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.011412 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.011428 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:40Z","lastTransitionTime":"2025-10-04T02:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.113871 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.113995 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.114013 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.114036 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.114054 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:40Z","lastTransitionTime":"2025-10-04T02:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.217907 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.217946 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.217953 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.217966 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.217976 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:40Z","lastTransitionTime":"2025-10-04T02:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.321197 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.321269 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.321292 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.321322 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.321409 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:40Z","lastTransitionTime":"2025-10-04T02:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.424505 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.424570 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.424589 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.424645 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.424664 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:40Z","lastTransitionTime":"2025-10-04T02:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.527180 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.527229 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.527245 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.527266 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.527282 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:40Z","lastTransitionTime":"2025-10-04T02:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.629921 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.630268 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.630284 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.630307 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.630325 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:40Z","lastTransitionTime":"2025-10-04T02:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.733375 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.733432 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.733448 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.733472 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.733489 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:40Z","lastTransitionTime":"2025-10-04T02:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.835959 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.836015 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.836031 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.836056 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.836072 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:40Z","lastTransitionTime":"2025-10-04T02:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.844874 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.844887 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.844947 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:40 crc kubenswrapper[4964]: E1004 02:41:40.846155 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:40 crc kubenswrapper[4964]: E1004 02:41:40.846374 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:40 crc kubenswrapper[4964]: E1004 02:41:40.846457 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.866282 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.873303 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfb
b085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":
\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:40Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.890456 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:40Z is after 
2025-08-24T17:21:41Z" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.907363 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb67628a0d9af7e9aa4f3
e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac
117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:40Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.929403 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:40Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.938735 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.938781 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.938798 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.938824 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.938842 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:40Z","lastTransitionTime":"2025-10-04T02:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.949985 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:40Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.972368 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T02:41:40Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:40 crc kubenswrapper[4964]: I1004 02:41:40.991007 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:40Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.007296 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c
4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:41Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.020329 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:41Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.034045 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f40185a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:41Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.042169 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.042219 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.042234 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.042256 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.042272 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:41Z","lastTransitionTime":"2025-10-04T02:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.053916 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:36Z\\\",\\\"message\\\":\\\"0:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1004 02:41:36.806362 6957 services_controller.go:356] Processing sync for service 
openshift-machine-api/machine-api-operator-machine-webhook for network=default\\\\nI1004 02:41:36.804867 6957 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1004 02:41:36.806385 6957 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1004 02:41:36.806008 6957 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:41:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051
342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:41Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.069219 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873
cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:41Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.086704 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"104ac332-90bc-4a0d-a085-0e47997cfdd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391bbf5e9b0494eb2fb5ab2d016230d876a51bd83eb52e759819dd11d54c1dc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79910f39f896467b97e336bf2d9ddc46fc68ce22e15162a78865049569c03c81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94e58235a34a6260cb38523b1823b73a5083dcff7f3af4167635bc394289072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:41Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.101305 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2d0d43-54d5-4b73-a227-0867264eee27\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87fdd14ec19d65c508672f966db71a585458cbea0a944a6960b56d8e0160eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:41Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.118776 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:41Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.135245 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:41Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.151031 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.151116 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.151141 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.151689 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.151723 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:41Z","lastTransitionTime":"2025-10-04T02:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.162660 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e79fc7fe0ec6d21d20e93ea04d085d1d23ee8bf1a2c766f100bf1d53b804d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:29Z\\\",\\\"message\\\":\\\"2025-10-04T02:40:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b\\\\n2025-10-04T02:40:44+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b to /host/opt/cni/bin/\\\\n2025-10-04T02:40:44Z [verbose] multus-daemon started\\\\n2025-10-04T02:40:44Z [verbose] Readiness Indicator file check\\\\n2025-10-04T02:41:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:41Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.182807 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:41Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.255686 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.255750 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.255769 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:41 crc 
kubenswrapper[4964]: I1004 02:41:41.255792 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.255810 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:41Z","lastTransitionTime":"2025-10-04T02:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.359373 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.359465 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.359530 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.359568 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.359592 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:41Z","lastTransitionTime":"2025-10-04T02:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.462807 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.462851 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.462861 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.462877 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.462890 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:41Z","lastTransitionTime":"2025-10-04T02:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.566849 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.566919 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.566936 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.566963 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.566981 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:41Z","lastTransitionTime":"2025-10-04T02:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.669665 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.669726 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.669746 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.669770 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.669788 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:41Z","lastTransitionTime":"2025-10-04T02:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.772846 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.772908 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.772925 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.772949 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.772968 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:41Z","lastTransitionTime":"2025-10-04T02:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.844661 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:41 crc kubenswrapper[4964]: E1004 02:41:41.844887 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.876026 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.876084 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.876101 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.876126 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.876142 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:41Z","lastTransitionTime":"2025-10-04T02:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.979947 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.980017 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.980035 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.980060 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:41 crc kubenswrapper[4964]: I1004 02:41:41.980080 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:41Z","lastTransitionTime":"2025-10-04T02:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.083659 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.083725 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.083741 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.083766 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.083785 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:42Z","lastTransitionTime":"2025-10-04T02:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.186819 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.186878 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.186897 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.186922 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.186942 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:42Z","lastTransitionTime":"2025-10-04T02:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.289570 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.289685 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.289713 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.289742 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.289768 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:42Z","lastTransitionTime":"2025-10-04T02:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.393304 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.393381 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.393408 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.393439 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.393465 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:42Z","lastTransitionTime":"2025-10-04T02:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.497373 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.497433 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.497449 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.497473 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.497491 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:42Z","lastTransitionTime":"2025-10-04T02:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.600776 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.600843 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.600866 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.600896 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.600922 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:42Z","lastTransitionTime":"2025-10-04T02:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.703833 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.703911 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.703940 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.703975 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.704003 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:42Z","lastTransitionTime":"2025-10-04T02:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.807018 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.807106 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.807130 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.807160 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.807183 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:42Z","lastTransitionTime":"2025-10-04T02:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.846153 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.846223 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:42 crc kubenswrapper[4964]: E1004 02:41:42.846381 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.846482 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:42 crc kubenswrapper[4964]: E1004 02:41:42.846649 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:42 crc kubenswrapper[4964]: E1004 02:41:42.847267 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.909416 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.909469 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.909486 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.909508 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:42 crc kubenswrapper[4964]: I1004 02:41:42.909525 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:42Z","lastTransitionTime":"2025-10-04T02:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.020905 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.020970 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.021263 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.021289 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.021307 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:43Z","lastTransitionTime":"2025-10-04T02:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.124557 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.124662 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.124682 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.124707 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.124725 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:43Z","lastTransitionTime":"2025-10-04T02:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.227979 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.228013 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.228023 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.228039 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.228053 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:43Z","lastTransitionTime":"2025-10-04T02:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.331706 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.331774 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.331794 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.331820 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.331851 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:43Z","lastTransitionTime":"2025-10-04T02:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.435844 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.435908 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.435924 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.435948 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.435967 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:43Z","lastTransitionTime":"2025-10-04T02:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.538325 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.538370 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.538386 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.538408 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.538425 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:43Z","lastTransitionTime":"2025-10-04T02:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.641191 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.641274 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.641299 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.641334 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.641358 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:43Z","lastTransitionTime":"2025-10-04T02:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.744024 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.744127 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.744146 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.744204 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.744222 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:43Z","lastTransitionTime":"2025-10-04T02:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.845044 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:43 crc kubenswrapper[4964]: E1004 02:41:43.845237 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.847593 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.847710 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.847737 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.847780 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.847810 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:43Z","lastTransitionTime":"2025-10-04T02:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.950875 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.950940 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.950958 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.950987 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:43 crc kubenswrapper[4964]: I1004 02:41:43.951007 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:43Z","lastTransitionTime":"2025-10-04T02:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.054301 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.054370 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.054386 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.054414 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.054435 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:44Z","lastTransitionTime":"2025-10-04T02:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.157943 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.157999 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.158020 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.158050 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.158072 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:44Z","lastTransitionTime":"2025-10-04T02:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.263133 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.263213 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.263237 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.263270 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.263293 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:44Z","lastTransitionTime":"2025-10-04T02:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.366694 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.366760 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.366777 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.366802 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.366820 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:44Z","lastTransitionTime":"2025-10-04T02:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.469177 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.469235 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.469252 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.469276 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.469293 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:44Z","lastTransitionTime":"2025-10-04T02:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.574724 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.574806 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.574828 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.574858 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.574879 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:44Z","lastTransitionTime":"2025-10-04T02:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.677900 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.677993 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.678016 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.678043 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.678063 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:44Z","lastTransitionTime":"2025-10-04T02:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.687046 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.687237 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 02:41:44.687256 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.687219489 +0000 UTC m=+148.584178167 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.687365 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 02:41:44.687464 4964 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 02:41:44.687549 4964 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 02:41:44.687598 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.68756003 +0000 UTC m=+148.584518708 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 02:41:44.687674 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.687647762 +0000 UTC m=+148.584606460 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.780973 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.781029 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.781046 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.781068 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.781083 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:44Z","lastTransitionTime":"2025-10-04T02:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.788943 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.789205 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 02:41:44.789326 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 02:41:44.789382 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 02:41:44.789396 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 
02:41:44.789406 4964 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 02:41:44.789424 4964 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 02:41:44.789443 4964 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 02:41:44.789503 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.789475853 +0000 UTC m=+148.686434531 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 02:41:44.789538 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.789522454 +0000 UTC m=+148.686481122 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.844995 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.845011 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 02:41:44.845227 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 02:41:44.845798 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.846038 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:44 crc kubenswrapper[4964]: E1004 02:41:44.846187 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.884274 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.884363 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.884396 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.884427 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.884448 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:44Z","lastTransitionTime":"2025-10-04T02:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.987958 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.988019 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.988036 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.988057 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:44 crc kubenswrapper[4964]: I1004 02:41:44.988074 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:44Z","lastTransitionTime":"2025-10-04T02:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.091144 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.091265 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.091288 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.091314 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.091331 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:45Z","lastTransitionTime":"2025-10-04T02:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.194226 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.194307 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.194331 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.194360 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.194379 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:45Z","lastTransitionTime":"2025-10-04T02:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.298374 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.298428 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.298444 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.298468 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.298485 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:45Z","lastTransitionTime":"2025-10-04T02:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.401468 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.401547 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.401600 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.401653 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.401672 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:45Z","lastTransitionTime":"2025-10-04T02:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.504670 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.504742 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.505106 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.505143 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.505165 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:45Z","lastTransitionTime":"2025-10-04T02:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.608524 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.608588 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.608855 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.608949 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.608974 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:45Z","lastTransitionTime":"2025-10-04T02:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.712175 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.712238 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.712253 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.712275 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.712293 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:45Z","lastTransitionTime":"2025-10-04T02:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.814835 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.814926 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.814943 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.814966 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.814984 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:45Z","lastTransitionTime":"2025-10-04T02:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.845193 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:45 crc kubenswrapper[4964]: E1004 02:41:45.845609 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.918795 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.918858 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.918882 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.918910 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:45 crc kubenswrapper[4964]: I1004 02:41:45.918931 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:45Z","lastTransitionTime":"2025-10-04T02:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.022239 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.022305 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.022325 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.022348 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.022364 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:46Z","lastTransitionTime":"2025-10-04T02:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.126246 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.126309 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.126326 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.126350 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.126369 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:46Z","lastTransitionTime":"2025-10-04T02:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.229933 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.229994 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.230013 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.230038 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.230054 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:46Z","lastTransitionTime":"2025-10-04T02:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.333240 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.333278 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.333287 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.333302 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.333312 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:46Z","lastTransitionTime":"2025-10-04T02:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.435668 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.435698 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.435706 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.435719 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.435750 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:46Z","lastTransitionTime":"2025-10-04T02:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.539144 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.539179 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.539188 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.539202 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.539212 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:46Z","lastTransitionTime":"2025-10-04T02:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.642451 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.642584 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.642602 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.642657 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.642684 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:46Z","lastTransitionTime":"2025-10-04T02:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.745855 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.746136 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.746154 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.746182 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.746199 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:46Z","lastTransitionTime":"2025-10-04T02:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.844790 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.844862 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:46 crc kubenswrapper[4964]: E1004 02:41:46.845001 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.845064 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:46 crc kubenswrapper[4964]: E1004 02:41:46.845310 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:46 crc kubenswrapper[4964]: E1004 02:41:46.845502 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.849383 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.849437 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.849454 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.849476 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.849493 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:46Z","lastTransitionTime":"2025-10-04T02:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.952730 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.952800 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.952823 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.952894 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:46 crc kubenswrapper[4964]: I1004 02:41:46.952919 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:46Z","lastTransitionTime":"2025-10-04T02:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.055940 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.056001 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.056024 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.056080 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.056106 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:47Z","lastTransitionTime":"2025-10-04T02:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.158773 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.158835 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.158851 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.158876 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.158894 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:47Z","lastTransitionTime":"2025-10-04T02:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.262009 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.262073 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.262091 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.262114 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.262136 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:47Z","lastTransitionTime":"2025-10-04T02:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.364833 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.364884 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.364910 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.364941 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.364964 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:47Z","lastTransitionTime":"2025-10-04T02:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.467360 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.467403 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.467415 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.467431 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.467442 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:47Z","lastTransitionTime":"2025-10-04T02:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.569997 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.570070 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.570103 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.570136 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.570164 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:47Z","lastTransitionTime":"2025-10-04T02:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.673729 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.673776 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.673793 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.673817 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.673834 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:47Z","lastTransitionTime":"2025-10-04T02:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.777037 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.777114 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.777137 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.777169 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.777196 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:47Z","lastTransitionTime":"2025-10-04T02:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.844446 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:47 crc kubenswrapper[4964]: E1004 02:41:47.844673 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.880241 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.880297 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.880312 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.880336 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.880355 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:47Z","lastTransitionTime":"2025-10-04T02:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.982753 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.982838 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.982862 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.982887 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:47 crc kubenswrapper[4964]: I1004 02:41:47.982905 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:47Z","lastTransitionTime":"2025-10-04T02:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.086096 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.086162 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.086182 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.086208 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.086229 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:48Z","lastTransitionTime":"2025-10-04T02:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.194965 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.195057 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.195081 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.195113 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.195141 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:48Z","lastTransitionTime":"2025-10-04T02:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.298584 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.298679 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.298696 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.298721 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.298738 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:48Z","lastTransitionTime":"2025-10-04T02:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.401853 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.401926 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.401949 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.401973 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.401989 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:48Z","lastTransitionTime":"2025-10-04T02:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.504945 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.505007 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.505028 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.505057 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.505081 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:48Z","lastTransitionTime":"2025-10-04T02:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.607457 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.607516 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.607537 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.607566 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.607586 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:48Z","lastTransitionTime":"2025-10-04T02:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.689383 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.689444 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.689533 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.689568 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.689590 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:48Z","lastTransitionTime":"2025-10-04T02:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:48 crc kubenswrapper[4964]: E1004 02:41:48.709567 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.714601 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.714680 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.714698 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.714723 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.714740 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:48Z","lastTransitionTime":"2025-10-04T02:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:48 crc kubenswrapper[4964]: E1004 02:41:48.736118 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.741204 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.741265 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.741285 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.741310 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.741327 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:48Z","lastTransitionTime":"2025-10-04T02:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:48 crc kubenswrapper[4964]: E1004 02:41:48.789701 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.794526 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.794586 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.794604 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.794653 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.794672 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:48Z","lastTransitionTime":"2025-10-04T02:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:48 crc kubenswrapper[4964]: E1004 02:41:48.813596 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.817772 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.817819 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.817833 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.817858 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.817875 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:48Z","lastTransitionTime":"2025-10-04T02:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:48 crc kubenswrapper[4964]: E1004 02:41:48.835638 4964 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5f4745f6-8127-4980-be1d-1af4770a22e1\\\",\\\"systemUUID\\\":\\\"b0d50052-71be-478d-b81f-25c1a6e2025f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:48Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:48 crc kubenswrapper[4964]: E1004 02:41:48.835778 4964 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.837435 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.837470 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.837487 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.837510 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.837525 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:48Z","lastTransitionTime":"2025-10-04T02:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.845103 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.845146 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.845103 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:48 crc kubenswrapper[4964]: E1004 02:41:48.845212 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:48 crc kubenswrapper[4964]: E1004 02:41:48.845360 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:48 crc kubenswrapper[4964]: E1004 02:41:48.845521 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.941106 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.941170 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.941188 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.941213 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:48 crc kubenswrapper[4964]: I1004 02:41:48.941230 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:48Z","lastTransitionTime":"2025-10-04T02:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.044453 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.044521 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.044539 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.044563 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.044586 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:49Z","lastTransitionTime":"2025-10-04T02:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.147921 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.148056 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.148075 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.148103 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.148153 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:49Z","lastTransitionTime":"2025-10-04T02:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.251275 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.251338 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.251354 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.251377 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.251396 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:49Z","lastTransitionTime":"2025-10-04T02:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.354400 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.354483 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.354511 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.354544 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.354567 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:49Z","lastTransitionTime":"2025-10-04T02:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.457511 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.457588 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.457606 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.457666 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.457686 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:49Z","lastTransitionTime":"2025-10-04T02:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.561148 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.561203 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.561222 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.561245 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.561263 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:49Z","lastTransitionTime":"2025-10-04T02:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.664005 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.664086 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.664111 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.664141 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.664163 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:49Z","lastTransitionTime":"2025-10-04T02:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.766831 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.766901 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.766918 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.766942 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.766961 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:49Z","lastTransitionTime":"2025-10-04T02:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.845196 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:49 crc kubenswrapper[4964]: E1004 02:41:49.845755 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.869809 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.869874 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.869896 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.869923 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.869945 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:49Z","lastTransitionTime":"2025-10-04T02:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.973163 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.973236 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.973258 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.973287 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:49 crc kubenswrapper[4964]: I1004 02:41:49.973309 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:49Z","lastTransitionTime":"2025-10-04T02:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.076222 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.076287 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.076310 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.076338 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.076358 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:50Z","lastTransitionTime":"2025-10-04T02:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.178977 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.179048 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.179066 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.179091 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.179107 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:50Z","lastTransitionTime":"2025-10-04T02:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.282586 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.282700 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.282718 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.282743 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.282760 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:50Z","lastTransitionTime":"2025-10-04T02:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.385750 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.385815 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.385833 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.385859 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.385876 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:50Z","lastTransitionTime":"2025-10-04T02:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.488666 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.488733 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.488757 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.488785 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.488805 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:50Z","lastTransitionTime":"2025-10-04T02:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.590918 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.590968 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.590990 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.591019 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.591043 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:50Z","lastTransitionTime":"2025-10-04T02:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.693785 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.693843 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.693861 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.693885 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.693901 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:50Z","lastTransitionTime":"2025-10-04T02:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.796215 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.796349 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.796372 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.796398 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.796416 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:50Z","lastTransitionTime":"2025-10-04T02:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.844492 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.844501 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.844560 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:50 crc kubenswrapper[4964]: E1004 02:41:50.844839 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:50 crc kubenswrapper[4964]: E1004 02:41:50.844934 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:50 crc kubenswrapper[4964]: E1004 02:41:50.845001 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.861183 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.881914 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-prcqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e36b0a0d-d6be-4917-a161-26245a74904a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7209e219ab4e8086cc96e5204ca3546f45f8bcc885ac1f78d64a0c5b1ce42f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb73cb80a521922e0b82ec632f4672e702221b4b8ff4ded4ded6f037a1d59a45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22d59ae289b968c8ccd95b591ed0b2830009774839c12fbfcd54b602fc91b06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74072df45e43ac83308b78c54dab6197fe206658f6f625dfa35f85a793472bd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdb2
869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdb2869080f9b5ba80b4dc2805f024a7dfbb32bd4bd674c81719d0135f6ce95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8cab30fc12e88c9b306c5bcb54803eebf744e8abac511b926628560ef0286b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9765386e061ce64f3c08e3fe0375134e64f7b562bc03206c9fe9ab0896aadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nn56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-prcqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.896972 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f1c9150-b444-41bb-9233-d76c4765a2d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nbzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xrr6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:50 crc 
kubenswrapper[4964]: I1004 02:41:50.899872 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.899953 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.899970 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.900030 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.900048 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:50Z","lastTransitionTime":"2025-10-04T02:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.916274 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a059170e-5cc7-4829-ad18-edfff34cd293\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06207e167a65f6fec31fc6cadb74eb65b2a94a2c2a6c6c2639cd5bc464e0643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5925adb6762
8a0d9af7e9aa4f3e762c015cb1a81da9ab1cea553988033ef1957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://894fdfa7b99b05262e015989b85dbe8fa45e84a039e3036b87c27d3979e9549b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4014486019dc91b62cb770bc75d12284eca0104ae5e998992ab7f52dd557418\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.949188 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9a2eae5-39bd-4d1e-a22a-9d2b844f427d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://413f9903ad851ef827bc418226f805580239a49bee5e331f6c5f3738ef34c3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://811f1cdfa9acfdae7c1a2cf93d39d272575ef104314e3ce20e6d9cd0decda3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c887b45983e03e922d4c6667c3095d5bbd78e1b43c7d570dd688ea4ccb94153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d9927624ca800063b9c628444d84e1ff3cb447d4291d00921c8306e3a8ad5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf54c8e3239be0eaaf2f90f7dd70463f5e21a956be327a7976fae4ea96b0530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://616b42c86af7b399da4a366796eaeb929444273db3b02f728e283bdfe28c3a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://616b42c86af7b399da4a366796eaeb929444273db3b02f728e283bdfe28c3a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89608ab72333c7613b66fecfe9f379e00c9eedc77bd9b8f7d7cc6594959f080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89608ab72333c7613b66fecfe9f379e00c9eedc77bd9b8f7d7cc6594959f080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1e907616bad5b8360e9b953786a3c8878ddf6ff319ca5a1ed32feec227a5e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e907616bad5b8360e9b953786a3c8878ddf6ff319ca5a1ed32feec227a5e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.975696 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38a4353a3e8b90e3bb878a69f4deccd2eeb92516a0a0ea578d9365e7b4830620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:50 crc kubenswrapper[4964]: I1004 02:41:50.993440 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:50Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.003715 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.003792 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.003811 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 
02:41:51.003837 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.003857 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:51Z","lastTransitionTime":"2025-10-04T02:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.007675 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d408f9e00674c823994cc8617d0f5d3202df34d83d385f314be19cb15842870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.022711 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.035040 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c02c3c-a484-46f9-a96d-8650b8f9c67f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c5015bd013e2af93d5ee47c8c93e06df4a11d0f3db62bb3e216c1f2adf13cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c
4c671f5fabe8330743fe4387\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znwzc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m7mv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.047206 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnp9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12ce3232-4729-4910-9890-a3da4586342c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03fc6466f916cf8b7f2de038f344069ddb21bbbb8f1bbccbaafa3b93ddddf420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5bjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnp9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.063611 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9bbac90-6a71-48f5-8524-799b00786492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff468d9448ef613a85881d9023b84c050cecb1fca32e7594ebcba4927edc97a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afaf3789213dd1df6139727f1ff4e0f1f40185a4d42bf60b150ab24224758eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pldrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9vc9h\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.085560 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:36Z\\\",\\\"message\\\":\\\"0:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1004 02:41:36.806362 6957 services_controller.go:356] Processing sync for service 
openshift-machine-api/machine-api-operator-machine-webhook for network=default\\\\nI1004 02:41:36.804867 6957 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1004 02:41:36.806385 6957 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1004 02:41:36.806008 6957 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:41:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8106c102f10fe051
342ca568e4474e2aa23f4bed19d8aca0719f76da16304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqkx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xrs78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.107484 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.107558 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.107577 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.107603 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.107651 4964 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:51Z","lastTransitionTime":"2025-10-04T02:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.107396 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff3935d8-3fa0-4ba5-8ff3-e3bc12428bee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a2b0ef98124ffcaf531b29ebaf479327fabd7c821c04fdb13d4d66effe9e9d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa4e92b86e0e3f57ee08422f774ad8ea314d582c4f8350271c4ebb504e22c49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96edf525176683a35c89b3f03516466efb8e1a2384870665a0643b64ad776c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0
beb90018f026c84e23f4765c6c684a1aa5ad4a2b4146f216fc74e4afc06e789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f76c0e8c541bd77d5385a90714af5e253d3b72aab363c8e06744831d74bb106\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T02:40:40Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1004 02:40:34.785913 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1004 02:40:34.788178 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713597179/tls.crt::/tmp/serving-cert-713597179/tls.key\\\\\\\"\\\\nI1004 02:40:40.551806 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1004 02:40:40.557671 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1004 02:40:40.557701 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1004 02:40:40.557740 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1004 02:40:40.557750 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1004 02:40:40.568579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1004 02:40:40.568655 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1004 02:40:40.568669 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1004 02:40:40.568675 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1004 02:40:40.568684 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1004 02:40:40.568727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1004 02:40:40.568736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1004 02:40:40.568742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1004 02:40:40.573639 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db843ba6901f81dd905ea51e0823a7186567d2f754181869e0a2507d7d97a4b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013688df13e385d4557009aa73565873cd4151e0ded6d3aa18448b06014d9d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.124117 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"104ac332-90bc-4a0d-a085-0e47997cfdd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391bbf5e9b0494eb2fb5ab2d016230d876a51bd83eb52e759819dd11d54c1dc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79910f39f896467b97e336bf2d9ddc46fc68ce22e15162a78865049569c03c81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a94e58235a34a6260cb38523b1823b73a5083dcff7f3af4167635bc394289072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d52659798f15697621d58691065d1c98a31b99eb197452fd5e8866aba8d11bfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.139825 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2d0d43-54d5-4b73-a227-0867264eee27\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87fdd14ec19d65c508672f966db71a585458cbea0a944a6960b56d8e0160eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91cf31c64d2ab8ebdade8bd99f7b6aba2bafb4ddd7cc111dd75a4288fd62de7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T02:40:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.157661 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca780576cfdfbb6fb18b7b98f02cbbd6161016a12f2556f046680a5a523ccdf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://117a731cdeb058b3798c5ad8d65d04fd4403358b3088f50624f1b5e4ddb3e7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.172298 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w556r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf670ba7-2bcb-4d80-b655-289c47e35cf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ee6bc9c47c132caedbb73b7ff87e441d1fd454525ae1ec655b26ac15a7b8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dvpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w556r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.191508 4964 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6hm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10ea848d-0322-476d-976d-4ae3ac39910b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T02:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e79fc7fe0ec6d21d20e93ea04d085d1d23ee8bf1a2c766f100bf1d53b804d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T02:41:29Z\\\",\\\"message\\\":\\\"2025-10-04T02:40:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b\\\\n2025-10-04T02:40:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ad2cfc69-357c-4d2c-99b8-696fc88f2c4b to /host/opt/cni/bin/\\\\n2025-10-04T02:40:44Z [verbose] multus-daemon started\\\\n2025-10-04T02:40:44Z [verbose] Readiness Indicator file check\\\\n2025-10-04T02:41:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T02:40:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T02:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fsxpm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T02:40:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6hm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T02:41:51Z is after 2025-08-24T17:21:41Z" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.211105 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.211170 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.211190 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 
02:41:51.211219 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.211239 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:51Z","lastTransitionTime":"2025-10-04T02:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.314405 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.314475 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.314493 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.314517 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.314533 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:51Z","lastTransitionTime":"2025-10-04T02:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.419142 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.419250 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.419282 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.419316 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.419355 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:51Z","lastTransitionTime":"2025-10-04T02:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.522583 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.522688 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.522739 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.522764 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.522781 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:51Z","lastTransitionTime":"2025-10-04T02:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.626108 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.626165 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.626182 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.626203 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.626220 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:51Z","lastTransitionTime":"2025-10-04T02:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.729908 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.729995 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.730018 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.730048 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.730073 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:51Z","lastTransitionTime":"2025-10-04T02:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.832899 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.832987 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.833015 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.833049 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.833074 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:51Z","lastTransitionTime":"2025-10-04T02:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.844560 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:51 crc kubenswrapper[4964]: E1004 02:41:51.844784 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.936186 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.936240 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.936257 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.936283 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:51 crc kubenswrapper[4964]: I1004 02:41:51.936304 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:51Z","lastTransitionTime":"2025-10-04T02:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.038674 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.038745 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.038762 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.038787 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.038804 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:52Z","lastTransitionTime":"2025-10-04T02:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.141237 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.141297 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.141315 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.141337 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.141356 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:52Z","lastTransitionTime":"2025-10-04T02:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.243967 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.244031 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.244050 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.244076 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.244096 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:52Z","lastTransitionTime":"2025-10-04T02:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.347395 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.347443 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.347459 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.347482 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.347499 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:52Z","lastTransitionTime":"2025-10-04T02:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.449894 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.449944 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.449959 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.449979 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.449996 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:52Z","lastTransitionTime":"2025-10-04T02:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.552448 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.552515 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.552539 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.552566 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.552589 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:52Z","lastTransitionTime":"2025-10-04T02:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.655483 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.655543 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.655561 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.655585 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.655602 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:52Z","lastTransitionTime":"2025-10-04T02:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.759837 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.759900 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.759917 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.759942 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.759959 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:52Z","lastTransitionTime":"2025-10-04T02:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.844994 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.845948 4964 scope.go:117] "RemoveContainer" containerID="fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b" Oct 04 02:41:52 crc kubenswrapper[4964]: E1004 02:41:52.846239 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.846490 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.846536 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:52 crc kubenswrapper[4964]: E1004 02:41:52.846669 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:52 crc kubenswrapper[4964]: E1004 02:41:52.846856 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:52 crc kubenswrapper[4964]: E1004 02:41:52.847073 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.863751 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.863807 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.863824 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.863847 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.863867 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:52Z","lastTransitionTime":"2025-10-04T02:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.966403 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.966461 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.966479 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.966506 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:52 crc kubenswrapper[4964]: I1004 02:41:52.966524 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:52Z","lastTransitionTime":"2025-10-04T02:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.069664 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.069733 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.069750 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.069775 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.069793 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:53Z","lastTransitionTime":"2025-10-04T02:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.173404 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.173462 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.173480 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.173503 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.173520 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:53Z","lastTransitionTime":"2025-10-04T02:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.277542 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.277599 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.277645 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.277671 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.277691 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:53Z","lastTransitionTime":"2025-10-04T02:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.380793 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.380872 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.380892 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.380923 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.380949 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:53Z","lastTransitionTime":"2025-10-04T02:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.484836 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.485277 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.485426 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.485566 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.485763 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:53Z","lastTransitionTime":"2025-10-04T02:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.589328 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.589390 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.589407 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.589430 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.589452 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:53Z","lastTransitionTime":"2025-10-04T02:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.692395 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.692452 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.692469 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.692495 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.692515 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:53Z","lastTransitionTime":"2025-10-04T02:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.795590 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.796046 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.796182 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.796312 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.796450 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:53Z","lastTransitionTime":"2025-10-04T02:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.844441 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:53 crc kubenswrapper[4964]: E1004 02:41:53.844922 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.899697 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.899772 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.899790 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.899815 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:53 crc kubenswrapper[4964]: I1004 02:41:53.899833 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:53Z","lastTransitionTime":"2025-10-04T02:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.002605 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.003231 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.003322 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.003406 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.003500 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:54Z","lastTransitionTime":"2025-10-04T02:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.106938 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.107346 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.107522 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.107695 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.107828 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:54Z","lastTransitionTime":"2025-10-04T02:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.211269 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.211339 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.211357 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.211384 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.211403 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:54Z","lastTransitionTime":"2025-10-04T02:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.314386 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.314443 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.314460 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.314483 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.314500 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:54Z","lastTransitionTime":"2025-10-04T02:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.417011 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.417098 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.417118 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.417147 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.417165 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:54Z","lastTransitionTime":"2025-10-04T02:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.520435 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.520496 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.520516 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.520540 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.520558 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:54Z","lastTransitionTime":"2025-10-04T02:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.623116 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.623149 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.623159 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.623175 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.623186 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:54Z","lastTransitionTime":"2025-10-04T02:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.725459 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.725501 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.725511 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.725528 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.725540 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:54Z","lastTransitionTime":"2025-10-04T02:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.828947 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.828990 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.829001 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.829016 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.829027 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:54Z","lastTransitionTime":"2025-10-04T02:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.845013 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.845040 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:54 crc kubenswrapper[4964]: E1004 02:41:54.845128 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.845014 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:54 crc kubenswrapper[4964]: E1004 02:41:54.845453 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:54 crc kubenswrapper[4964]: E1004 02:41:54.845683 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.932020 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.932080 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.932097 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.932122 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:54 crc kubenswrapper[4964]: I1004 02:41:54.932140 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:54Z","lastTransitionTime":"2025-10-04T02:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.036230 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.036399 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.036427 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.036459 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.036480 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:55Z","lastTransitionTime":"2025-10-04T02:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.139064 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.139135 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.139160 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.139186 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.139205 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:55Z","lastTransitionTime":"2025-10-04T02:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.242204 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.242261 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.242276 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.242299 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.242316 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:55Z","lastTransitionTime":"2025-10-04T02:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.345771 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.345845 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.345865 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.345888 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.345908 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:55Z","lastTransitionTime":"2025-10-04T02:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.449365 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.449430 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.449453 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.449482 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.449504 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:55Z","lastTransitionTime":"2025-10-04T02:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.552040 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.552082 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.552090 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.552102 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.552111 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:55Z","lastTransitionTime":"2025-10-04T02:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.654445 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.654515 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.654530 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.654551 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.654566 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:55Z","lastTransitionTime":"2025-10-04T02:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.758254 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.758311 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.758322 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.758363 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.758374 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:55Z","lastTransitionTime":"2025-10-04T02:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.844706 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:55 crc kubenswrapper[4964]: E1004 02:41:55.844932 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.860958 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.861003 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.861021 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.861043 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.861061 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:55Z","lastTransitionTime":"2025-10-04T02:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.963897 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.963973 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.963994 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.964021 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:55 crc kubenswrapper[4964]: I1004 02:41:55.964040 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:55Z","lastTransitionTime":"2025-10-04T02:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.067685 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.067769 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.067788 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.067813 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.067830 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:56Z","lastTransitionTime":"2025-10-04T02:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.170352 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.170405 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.170421 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.170443 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.170458 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:56Z","lastTransitionTime":"2025-10-04T02:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.273446 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.273491 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.273502 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.273518 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.273529 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:56Z","lastTransitionTime":"2025-10-04T02:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.377088 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.377147 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.377165 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.377190 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.377208 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:56Z","lastTransitionTime":"2025-10-04T02:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.480093 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.480158 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.480183 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.480212 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.480234 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:56Z","lastTransitionTime":"2025-10-04T02:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.583825 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.583889 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.583906 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.583944 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.583961 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:56Z","lastTransitionTime":"2025-10-04T02:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.687479 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.687522 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.687537 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.687559 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.687576 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:56Z","lastTransitionTime":"2025-10-04T02:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.790079 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.790139 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.790162 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.790193 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.790213 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:56Z","lastTransitionTime":"2025-10-04T02:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.844806 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.844911 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:56 crc kubenswrapper[4964]: E1004 02:41:56.844998 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.845055 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:56 crc kubenswrapper[4964]: E1004 02:41:56.845201 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:56 crc kubenswrapper[4964]: E1004 02:41:56.845326 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.893693 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.893756 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.893773 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.893799 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.893828 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:56Z","lastTransitionTime":"2025-10-04T02:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.997134 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.997219 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.997243 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.997271 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:56 crc kubenswrapper[4964]: I1004 02:41:56.997294 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:56Z","lastTransitionTime":"2025-10-04T02:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.100593 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.100707 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.100732 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.100761 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.100784 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:57Z","lastTransitionTime":"2025-10-04T02:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.204437 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.204506 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.204530 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.204556 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.204573 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:57Z","lastTransitionTime":"2025-10-04T02:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.307358 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.307426 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.307452 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.307481 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.307503 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:57Z","lastTransitionTime":"2025-10-04T02:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.410787 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.410853 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.410874 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.410902 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.410928 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:57Z","lastTransitionTime":"2025-10-04T02:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.514218 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.514281 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.514297 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.514321 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.514338 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:57Z","lastTransitionTime":"2025-10-04T02:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.617421 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.617491 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.617510 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.617542 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.617561 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:57Z","lastTransitionTime":"2025-10-04T02:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.720140 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.720207 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.720227 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.720252 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.720270 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:57Z","lastTransitionTime":"2025-10-04T02:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.823769 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.823833 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.823859 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.823885 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.823902 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:57Z","lastTransitionTime":"2025-10-04T02:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.844271 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:57 crc kubenswrapper[4964]: E1004 02:41:57.844440 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.926376 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.926448 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.926473 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.926504 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:57 crc kubenswrapper[4964]: I1004 02:41:57.926526 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:57Z","lastTransitionTime":"2025-10-04T02:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.030662 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.030784 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.030808 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.030839 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.030864 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:58Z","lastTransitionTime":"2025-10-04T02:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.134489 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.134565 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.134608 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.134689 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.134710 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:58Z","lastTransitionTime":"2025-10-04T02:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.238099 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.238164 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.238186 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.238216 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.238237 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:58Z","lastTransitionTime":"2025-10-04T02:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.341294 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.341348 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.341364 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.341387 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.341405 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:58Z","lastTransitionTime":"2025-10-04T02:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.446722 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.446801 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.446821 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.446851 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.446870 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:58Z","lastTransitionTime":"2025-10-04T02:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.549737 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.549793 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.549809 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.549831 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.549848 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:58Z","lastTransitionTime":"2025-10-04T02:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.653144 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.653218 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.653237 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.653262 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.653279 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:58Z","lastTransitionTime":"2025-10-04T02:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.756186 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.756246 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.756262 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.756285 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.756303 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:58Z","lastTransitionTime":"2025-10-04T02:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.844561 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.844684 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.844577 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:41:58 crc kubenswrapper[4964]: E1004 02:41:58.844781 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:41:58 crc kubenswrapper[4964]: E1004 02:41:58.844885 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:41:58 crc kubenswrapper[4964]: E1004 02:41:58.844993 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.859154 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.859193 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.859203 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.859219 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.859232 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:58Z","lastTransitionTime":"2025-10-04T02:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.860306 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.860364 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.860387 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.860416 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.860441 4964 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T02:41:58Z","lastTransitionTime":"2025-10-04T02:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.922212 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl"] Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.922785 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.925530 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.925998 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.926478 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 04 02:41:58 crc kubenswrapper[4964]: I1004 02:41:58.926492 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.006127 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=19.006080436 podStartE2EDuration="19.006080436s" podCreationTimestamp="2025-10-04 02:41:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:41:59.002076996 +0000 UTC m=+98.899035644" watchObservedRunningTime="2025-10-04 02:41:59.006080436 +0000 UTC m=+98.903039074" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.053814 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bllkl\" (UID: \"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.053892 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bllkl\" (UID: \"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.053924 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bllkl\" (UID: \"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.053974 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bllkl\" (UID: \"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.054010 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bllkl\" (UID: \"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.093031 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-prcqh" podStartSLOduration=78.093003898 podStartE2EDuration="1m18.093003898s" podCreationTimestamp="2025-10-04 02:40:41 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:41:59.092933836 +0000 UTC m=+98.989892474" watchObservedRunningTime="2025-10-04 02:41:59.093003898 +0000 UTC m=+98.989962576" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.123379 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=75.12335428 podStartE2EDuration="1m15.12335428s" podCreationTimestamp="2025-10-04 02:40:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:41:59.122170305 +0000 UTC m=+99.019128943" watchObservedRunningTime="2025-10-04 02:41:59.12335428 +0000 UTC m=+99.020312948" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.137496 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bnp9l" podStartSLOduration=78.137477115 podStartE2EDuration="1m18.137477115s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:41:59.136680811 +0000 UTC m=+99.033639459" watchObservedRunningTime="2025-10-04 02:41:59.137477115 +0000 UTC m=+99.034435793" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.155763 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bllkl\" (UID: \"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.155854 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bllkl\" (UID: \"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.155938 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bllkl\" (UID: \"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.156023 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bllkl\" (UID: \"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.156117 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bllkl\" (UID: \"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.156227 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bllkl\" (UID: \"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.156269 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bllkl\" (UID: \"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.156800 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bllkl\" (UID: \"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.177096 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bllkl\" (UID: \"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.177986 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9vc9h" podStartSLOduration=78.177971652 podStartE2EDuration="1m18.177971652s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:41:59.158779875 +0000 UTC m=+99.055738533" watchObservedRunningTime="2025-10-04 02:41:59.177971652 +0000 UTC m=+99.074930300" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 
02:41:59.178578 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podStartSLOduration=78.17857158 podStartE2EDuration="1m18.17857158s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:41:59.17857296 +0000 UTC m=+99.075531638" watchObservedRunningTime="2025-10-04 02:41:59.17857158 +0000 UTC m=+99.075530228" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.188404 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bllkl\" (UID: \"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.197140 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=47.197120718 podStartE2EDuration="47.197120718s" podCreationTimestamp="2025-10-04 02:41:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:41:59.196474187 +0000 UTC m=+99.093432855" watchObservedRunningTime="2025-10-04 02:41:59.197120718 +0000 UTC m=+99.094079366" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.213179 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=30.213156219 podStartE2EDuration="30.213156219s" podCreationTimestamp="2025-10-04 02:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-04 02:41:59.212665824 +0000 UTC m=+99.109624472" watchObservedRunningTime="2025-10-04 02:41:59.213156219 +0000 UTC m=+99.110114897" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.251080 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.257354 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w556r" podStartSLOduration=78.257323016 podStartE2EDuration="1m18.257323016s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:41:59.24878101 +0000 UTC m=+99.145739678" watchObservedRunningTime="2025-10-04 02:41:59.257323016 +0000 UTC m=+99.154281674" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.296842 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-q6hm8" podStartSLOduration=78.296814883 podStartE2EDuration="1m18.296814883s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:41:59.296524134 +0000 UTC m=+99.193482812" watchObservedRunningTime="2025-10-04 02:41:59.296814883 +0000 UTC m=+99.193773551" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.344435 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.344390343 podStartE2EDuration="1m18.344390343s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:41:59.342901109 +0000 UTC m=+99.239859797" 
watchObservedRunningTime="2025-10-04 02:41:59.344390343 +0000 UTC m=+99.241348981" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.507769 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" event={"ID":"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476","Type":"ContainerStarted","Data":"22662e3a000a75691cd7efe34c2e516a4360d565b41c95097995eafc003f279f"} Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.508156 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" event={"ID":"ad4eaf8b-ecb8-4d2c-8d05-c7b2296cb476","Type":"ContainerStarted","Data":"450f78a3b58a5c7a3fd9bde364ac43161e826b3890b2d81f4226c0b7f1fcfa77"} Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.532787 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bllkl" podStartSLOduration=78.532766064 podStartE2EDuration="1m18.532766064s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:41:59.531816975 +0000 UTC m=+99.428775673" watchObservedRunningTime="2025-10-04 02:41:59.532766064 +0000 UTC m=+99.429724702" Oct 04 02:41:59 crc kubenswrapper[4964]: I1004 02:41:59.844818 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:41:59 crc kubenswrapper[4964]: E1004 02:41:59.845019 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:42:00 crc kubenswrapper[4964]: I1004 02:42:00.844550 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:42:00 crc kubenswrapper[4964]: I1004 02:42:00.844663 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:42:00 crc kubenswrapper[4964]: E1004 02:42:00.846513 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:42:00 crc kubenswrapper[4964]: I1004 02:42:00.846795 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:42:00 crc kubenswrapper[4964]: E1004 02:42:00.846805 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:42:00 crc kubenswrapper[4964]: E1004 02:42:00.847791 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:42:00 crc kubenswrapper[4964]: I1004 02:42:00.876786 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs\") pod \"network-metrics-daemon-xrr6r\" (UID: \"7f1c9150-b444-41bb-9233-d76c4765a2d0\") " pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:42:00 crc kubenswrapper[4964]: E1004 02:42:00.876954 4964 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 02:42:00 crc kubenswrapper[4964]: E1004 02:42:00.877152 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs podName:7f1c9150-b444-41bb-9233-d76c4765a2d0 nodeName:}" failed. No retries permitted until 2025-10-04 02:43:04.877128114 +0000 UTC m=+164.774086792 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs") pod "network-metrics-daemon-xrr6r" (UID: "7f1c9150-b444-41bb-9233-d76c4765a2d0") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 02:42:01 crc kubenswrapper[4964]: I1004 02:42:01.845109 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:42:01 crc kubenswrapper[4964]: E1004 02:42:01.845300 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:42:02 crc kubenswrapper[4964]: I1004 02:42:02.844891 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:42:02 crc kubenswrapper[4964]: I1004 02:42:02.844939 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:42:02 crc kubenswrapper[4964]: I1004 02:42:02.845023 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:42:02 crc kubenswrapper[4964]: E1004 02:42:02.845079 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:42:02 crc kubenswrapper[4964]: E1004 02:42:02.845356 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:42:02 crc kubenswrapper[4964]: E1004 02:42:02.845539 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:42:03 crc kubenswrapper[4964]: I1004 02:42:03.845205 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:42:03 crc kubenswrapper[4964]: E1004 02:42:03.845721 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:42:04 crc kubenswrapper[4964]: I1004 02:42:04.844742 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:42:04 crc kubenswrapper[4964]: I1004 02:42:04.844770 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 04 02:42:04 crc kubenswrapper[4964]: E1004 02:42:04.844963 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0"
Oct 04 02:42:04 crc kubenswrapper[4964]: E1004 02:42:04.845045 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 04 02:42:04 crc kubenswrapper[4964]: I1004 02:42:04.844770 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 04 02:42:04 crc kubenswrapper[4964]: E1004 02:42:04.845770 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 04 02:42:05 crc kubenswrapper[4964]: I1004 02:42:05.845177 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 04 02:42:05 crc kubenswrapper[4964]: E1004 02:42:05.845997 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 04 02:42:06 crc kubenswrapper[4964]: I1004 02:42:06.844183 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r"
Oct 04 02:42:06 crc kubenswrapper[4964]: E1004 02:42:06.844330 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0"
Oct 04 02:42:06 crc kubenswrapper[4964]: I1004 02:42:06.844369 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 04 02:42:06 crc kubenswrapper[4964]: I1004 02:42:06.844441 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 04 02:42:06 crc kubenswrapper[4964]: E1004 02:42:06.844489 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 04 02:42:06 crc kubenswrapper[4964]: E1004 02:42:06.844669 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 04 02:42:07 crc kubenswrapper[4964]: I1004 02:42:07.844996 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 04 02:42:07 crc kubenswrapper[4964]: E1004 02:42:07.845234 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 04 02:42:07 crc kubenswrapper[4964]: I1004 02:42:07.846513 4964 scope.go:117] "RemoveContainer" containerID="fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b"
Oct 04 02:42:07 crc kubenswrapper[4964]: E1004 02:42:07.846855 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xrs78_openshift-ovn-kubernetes(74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"
Oct 04 02:42:08 crc kubenswrapper[4964]: I1004 02:42:08.845067 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 04 02:42:08 crc kubenswrapper[4964]: I1004 02:42:08.845214 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 04 02:42:08 crc kubenswrapper[4964]: I1004 02:42:08.845432 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r"
Oct 04 02:42:08 crc kubenswrapper[4964]: E1004 02:42:08.845591 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0"
Oct 04 02:42:08 crc kubenswrapper[4964]: E1004 02:42:08.845818 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 04 02:42:08 crc kubenswrapper[4964]: E1004 02:42:08.846036 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 04 02:42:09 crc kubenswrapper[4964]: I1004 02:42:09.845010 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 04 02:42:09 crc kubenswrapper[4964]: E1004 02:42:09.845239 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 04 02:42:10 crc kubenswrapper[4964]: I1004 02:42:10.844532 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 04 02:42:10 crc kubenswrapper[4964]: E1004 02:42:10.844786 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 04 02:42:10 crc kubenswrapper[4964]: I1004 02:42:10.844902 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 04 02:42:10 crc kubenswrapper[4964]: I1004 02:42:10.844979 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r"
Oct 04 02:42:10 crc kubenswrapper[4964]: E1004 02:42:10.847014 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 04 02:42:10 crc kubenswrapper[4964]: E1004 02:42:10.847201 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0"
Oct 04 02:42:11 crc kubenswrapper[4964]: I1004 02:42:11.844943 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 04 02:42:11 crc kubenswrapper[4964]: E1004 02:42:11.845131 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 04 02:42:12 crc kubenswrapper[4964]: I1004 02:42:12.845706 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r"
Oct 04 02:42:12 crc kubenswrapper[4964]: I1004 02:42:12.845822 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 04 02:42:12 crc kubenswrapper[4964]: E1004 02:42:12.846028 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0"
Oct 04 02:42:12 crc kubenswrapper[4964]: I1004 02:42:12.846098 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 04 02:42:12 crc kubenswrapper[4964]: E1004 02:42:12.846285 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 04 02:42:12 crc kubenswrapper[4964]: E1004 02:42:12.846370 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 04 02:42:13 crc kubenswrapper[4964]: I1004 02:42:13.844658 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 04 02:42:13 crc kubenswrapper[4964]: E1004 02:42:13.844842 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 04 02:42:14 crc kubenswrapper[4964]: I1004 02:42:14.844509 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r"
Oct 04 02:42:14 crc kubenswrapper[4964]: I1004 02:42:14.844518 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 04 02:42:14 crc kubenswrapper[4964]: I1004 02:42:14.844656 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 04 02:42:14 crc kubenswrapper[4964]: E1004 02:42:14.844818 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0"
Oct 04 02:42:14 crc kubenswrapper[4964]: E1004 02:42:14.844917 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 04 02:42:14 crc kubenswrapper[4964]: E1004 02:42:14.845143 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 04 02:42:15 crc kubenswrapper[4964]: I1004 02:42:15.845279 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 04 02:42:15 crc kubenswrapper[4964]: E1004 02:42:15.845692 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 04 02:42:16 crc kubenswrapper[4964]: I1004 02:42:16.575804 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q6hm8_10ea848d-0322-476d-976d-4ae3ac39910b/kube-multus/1.log"
Oct 04 02:42:16 crc kubenswrapper[4964]: I1004 02:42:16.576477 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q6hm8_10ea848d-0322-476d-976d-4ae3ac39910b/kube-multus/0.log"
Oct 04 02:42:16 crc kubenswrapper[4964]: I1004 02:42:16.576545 4964 generic.go:334] "Generic (PLEG): container finished" podID="10ea848d-0322-476d-976d-4ae3ac39910b" containerID="a79e79fc7fe0ec6d21d20e93ea04d085d1d23ee8bf1a2c766f100bf1d53b804d" exitCode=1
Oct 04 02:42:16 crc kubenswrapper[4964]: I1004 02:42:16.576591 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q6hm8" event={"ID":"10ea848d-0322-476d-976d-4ae3ac39910b","Type":"ContainerDied","Data":"a79e79fc7fe0ec6d21d20e93ea04d085d1d23ee8bf1a2c766f100bf1d53b804d"}
Oct 04 02:42:16 crc kubenswrapper[4964]: I1004 02:42:16.576685 4964 scope.go:117] "RemoveContainer" containerID="b9b01add5e2c8d81b9b89e99d2fde454f727be0e0705cb89ef43740c53034709"
Oct 04 02:42:16 crc kubenswrapper[4964]: I1004 02:42:16.577288 4964 scope.go:117] "RemoveContainer" containerID="a79e79fc7fe0ec6d21d20e93ea04d085d1d23ee8bf1a2c766f100bf1d53b804d"
Oct 04 02:42:16 crc kubenswrapper[4964]: E1004 02:42:16.579058 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-q6hm8_openshift-multus(10ea848d-0322-476d-976d-4ae3ac39910b)\"" pod="openshift-multus/multus-q6hm8" podUID="10ea848d-0322-476d-976d-4ae3ac39910b"
Oct 04 02:42:16 crc kubenswrapper[4964]: I1004 02:42:16.845896 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 04 02:42:16 crc kubenswrapper[4964]: I1004 02:42:16.845925 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 04 02:42:16 crc kubenswrapper[4964]: E1004 02:42:16.846366 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 04 02:42:16 crc kubenswrapper[4964]: E1004 02:42:16.846512 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 04 02:42:16 crc kubenswrapper[4964]: I1004 02:42:16.845995 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r"
Oct 04 02:42:16 crc kubenswrapper[4964]: E1004 02:42:16.846667 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0"
Oct 04 02:42:17 crc kubenswrapper[4964]: I1004 02:42:17.581712 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q6hm8_10ea848d-0322-476d-976d-4ae3ac39910b/kube-multus/1.log"
Oct 04 02:42:17 crc kubenswrapper[4964]: I1004 02:42:17.844796 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 04 02:42:17 crc kubenswrapper[4964]: E1004 02:42:17.844995 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 04 02:42:18 crc kubenswrapper[4964]: I1004 02:42:18.844276 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 04 02:42:18 crc kubenswrapper[4964]: I1004 02:42:18.844365 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r"
Oct 04 02:42:18 crc kubenswrapper[4964]: E1004 02:42:18.844441 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 04 02:42:18 crc kubenswrapper[4964]: E1004 02:42:18.844542 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0"
Oct 04 02:42:18 crc kubenswrapper[4964]: I1004 02:42:18.844688 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 04 02:42:18 crc kubenswrapper[4964]: E1004 02:42:18.844786 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 04 02:42:19 crc kubenswrapper[4964]: I1004 02:42:19.844550 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 04 02:42:19 crc kubenswrapper[4964]: E1004 02:42:19.844786 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 04 02:42:20 crc kubenswrapper[4964]: I1004 02:42:20.845081 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 04 02:42:20 crc kubenswrapper[4964]: I1004 02:42:20.855359 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 04 02:42:20 crc kubenswrapper[4964]: I1004 02:42:20.855393 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r"
Oct 04 02:42:20 crc kubenswrapper[4964]: E1004 02:42:20.855666 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 04 02:42:20 crc kubenswrapper[4964]: E1004 02:42:20.855831 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0"
Oct 04 02:42:20 crc kubenswrapper[4964]: E1004 02:42:20.857902 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 04 02:42:20 crc kubenswrapper[4964]: I1004 02:42:20.858198 4964 scope.go:117] "RemoveContainer" containerID="fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b"
Oct 04 02:42:20 crc kubenswrapper[4964]: E1004 02:42:20.877098 4964 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Oct 04 02:42:20 crc kubenswrapper[4964]: E1004 02:42:20.968672 4964 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 04 02:42:21 crc kubenswrapper[4964]: I1004 02:42:21.599924 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovnkube-controller/3.log"
Oct 04 02:42:21 crc kubenswrapper[4964]: I1004 02:42:21.603416 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerStarted","Data":"34418f62ccb97659c09eb2365f742b6446c010e8e64820cc506fd20baa60f0e0"}
Oct 04 02:42:21 crc kubenswrapper[4964]: I1004 02:42:21.604347 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78"
Oct 04 02:42:21 crc kubenswrapper[4964]: I1004 02:42:21.844890 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 04 02:42:21 crc kubenswrapper[4964]: E1004 02:42:21.845042 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 04 02:42:22 crc kubenswrapper[4964]: I1004 02:42:22.020416 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podStartSLOduration=101.020346871 podStartE2EDuration="1m41.020346871s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:21.653030723 +0000 UTC m=+121.549989371" watchObservedRunningTime="2025-10-04 02:42:22.020346871 +0000 UTC m=+121.917305559"
Oct 04 02:42:22 crc kubenswrapper[4964]: I1004 02:42:22.022393 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xrr6r"]
Oct 04 02:42:22 crc kubenswrapper[4964]: I1004 02:42:22.022554 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r"
Oct 04 02:42:22 crc kubenswrapper[4964]: E1004 02:42:22.022800 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0"
Oct 04 02:42:22 crc kubenswrapper[4964]: I1004 02:42:22.844430 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 04 02:42:22 crc kubenswrapper[4964]: I1004 02:42:22.844484 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 04 02:42:22 crc kubenswrapper[4964]: E1004 02:42:22.844669 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 04 02:42:22 crc kubenswrapper[4964]: E1004 02:42:22.844760 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 04 02:42:23 crc kubenswrapper[4964]: I1004 02:42:23.845138 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 04 02:42:23 crc kubenswrapper[4964]: I1004 02:42:23.845195 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r"
Oct 04 02:42:23 crc kubenswrapper[4964]: E1004 02:42:23.845321 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 04 02:42:23 crc kubenswrapper[4964]: E1004 02:42:23.845468 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0"
Oct 04 02:42:24 crc kubenswrapper[4964]: I1004 02:42:24.844913 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 04 02:42:24 crc kubenswrapper[4964]: E1004 02:42:24.845070 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 04 02:42:24 crc kubenswrapper[4964]: I1004 02:42:24.845189 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 04 02:42:24 crc kubenswrapper[4964]: E1004 02:42:24.845375 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 04 02:42:25 crc kubenswrapper[4964]: I1004 02:42:25.844726 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r"
Oct 04 02:42:25 crc kubenswrapper[4964]: E1004 02:42:25.844921 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0"
Oct 04 02:42:25 crc kubenswrapper[4964]: I1004 02:42:25.844734 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 04 02:42:25 crc kubenswrapper[4964]: E1004 02:42:25.845060 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 04 02:42:25 crc kubenswrapper[4964]: E1004 02:42:25.970673 4964 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 04 02:42:26 crc kubenswrapper[4964]: I1004 02:42:26.844990 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 04 02:42:26 crc kubenswrapper[4964]: E1004 02:42:26.845105 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 04 02:42:26 crc kubenswrapper[4964]: I1004 02:42:26.845254 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 04 02:42:26 crc kubenswrapper[4964]: E1004 02:42:26.845830 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 04 02:42:27 crc kubenswrapper[4964]: I1004 02:42:27.844322 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r"
Oct 04 02:42:27 crc kubenswrapper[4964]: I1004 02:42:27.844396 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 04 02:42:27 crc kubenswrapper[4964]: E1004 02:42:27.844528 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0"
Oct 04 02:42:27 crc kubenswrapper[4964]: E1004 02:42:27.844707 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 04 02:42:28 crc kubenswrapper[4964]: I1004 02:42:28.844310 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 04 02:42:28 crc kubenswrapper[4964]: I1004 02:42:28.844416 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 04 02:42:28 crc kubenswrapper[4964]: E1004 02:42:28.844558 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 04 02:42:28 crc kubenswrapper[4964]: E1004 02:42:28.845813 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 04 02:42:29 crc kubenswrapper[4964]: I1004 02:42:29.845117 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r"
Oct 04 02:42:29 crc kubenswrapper[4964]: I1004 02:42:29.845166 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 04 02:42:29 crc kubenswrapper[4964]: E1004 02:42:29.845227 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0"
Oct 04 02:42:29 crc kubenswrapper[4964]: E1004 02:42:29.845476 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:42:29 crc kubenswrapper[4964]: I1004 02:42:29.845906 4964 scope.go:117] "RemoveContainer" containerID="a79e79fc7fe0ec6d21d20e93ea04d085d1d23ee8bf1a2c766f100bf1d53b804d" Oct 04 02:42:30 crc kubenswrapper[4964]: I1004 02:42:30.638446 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q6hm8_10ea848d-0322-476d-976d-4ae3ac39910b/kube-multus/1.log" Oct 04 02:42:30 crc kubenswrapper[4964]: I1004 02:42:30.638848 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q6hm8" event={"ID":"10ea848d-0322-476d-976d-4ae3ac39910b","Type":"ContainerStarted","Data":"b667cd3fd52cb198a428e96a085b12b34b610116ecc8aab4a77964917a4d4c6c"} Oct 04 02:42:30 crc kubenswrapper[4964]: I1004 02:42:30.845176 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:42:30 crc kubenswrapper[4964]: I1004 02:42:30.845184 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:42:30 crc kubenswrapper[4964]: E1004 02:42:30.847792 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:42:30 crc kubenswrapper[4964]: E1004 02:42:30.847988 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:42:30 crc kubenswrapper[4964]: E1004 02:42:30.971435 4964 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 04 02:42:31 crc kubenswrapper[4964]: I1004 02:42:31.844930 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:42:31 crc kubenswrapper[4964]: E1004 02:42:31.845125 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:42:31 crc kubenswrapper[4964]: I1004 02:42:31.844926 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:42:31 crc kubenswrapper[4964]: E1004 02:42:31.845590 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:42:32 crc kubenswrapper[4964]: I1004 02:42:32.844582 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:42:32 crc kubenswrapper[4964]: I1004 02:42:32.844592 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:42:32 crc kubenswrapper[4964]: E1004 02:42:32.844858 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:42:32 crc kubenswrapper[4964]: E1004 02:42:32.844939 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:42:33 crc kubenswrapper[4964]: I1004 02:42:33.844360 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:42:33 crc kubenswrapper[4964]: E1004 02:42:33.844553 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:42:33 crc kubenswrapper[4964]: I1004 02:42:33.844850 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:42:33 crc kubenswrapper[4964]: E1004 02:42:33.845046 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:42:34 crc kubenswrapper[4964]: I1004 02:42:34.844687 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:42:34 crc kubenswrapper[4964]: E1004 02:42:34.844798 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 02:42:34 crc kubenswrapper[4964]: I1004 02:42:34.844687 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:42:34 crc kubenswrapper[4964]: E1004 02:42:34.845111 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 02:42:35 crc kubenswrapper[4964]: I1004 02:42:35.845122 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:42:35 crc kubenswrapper[4964]: I1004 02:42:35.845128 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:42:35 crc kubenswrapper[4964]: E1004 02:42:35.845281 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 02:42:35 crc kubenswrapper[4964]: E1004 02:42:35.845358 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrr6r" podUID="7f1c9150-b444-41bb-9233-d76c4765a2d0" Oct 04 02:42:36 crc kubenswrapper[4964]: I1004 02:42:36.844358 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:42:36 crc kubenswrapper[4964]: I1004 02:42:36.844404 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:42:36 crc kubenswrapper[4964]: I1004 02:42:36.846542 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 04 02:42:36 crc kubenswrapper[4964]: I1004 02:42:36.847277 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 04 02:42:37 crc kubenswrapper[4964]: I1004 02:42:37.845128 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:42:37 crc kubenswrapper[4964]: I1004 02:42:37.845183 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:42:37 crc kubenswrapper[4964]: I1004 02:42:37.848137 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 04 02:42:37 crc kubenswrapper[4964]: I1004 02:42:37.848209 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 04 02:42:37 crc kubenswrapper[4964]: I1004 02:42:37.848209 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 04 02:42:37 crc kubenswrapper[4964]: I1004 02:42:37.848958 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.674313 4964 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.726910 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8dqhm"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.727871 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nr92k"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.728157 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.728609 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.737588 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b4x88"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.738358 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.739977 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-brgcm"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.740927 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.748278 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z56l\" (UniqueName: \"kubernetes.io/projected/1a40013f-a275-4839-8f62-09488dbf53a3-kube-api-access-2z56l\") pod \"authentication-operator-69f744f599-brgcm\" (UID: \"1a40013f-a275-4839-8f62-09488dbf53a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.748340 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c9da68e-d45d-44c9-be51-0b8a38042692-config\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.748375 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5rml\" (UniqueName: 
\"kubernetes.io/projected/cc0ac95b-a7a9-4b23-a073-99146acc645d-kube-api-access-w5rml\") pod \"machine-api-operator-5694c8668f-b4x88\" (UID: \"cc0ac95b-a7a9-4b23-a073-99146acc645d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.748411 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a40013f-a275-4839-8f62-09488dbf53a3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-brgcm\" (UID: \"1a40013f-a275-4839-8f62-09488dbf53a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.748480 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a40013f-a275-4839-8f62-09488dbf53a3-service-ca-bundle\") pod \"authentication-operator-69f744f599-brgcm\" (UID: \"1a40013f-a275-4839-8f62-09488dbf53a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.748568 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc0ac95b-a7a9-4b23-a073-99146acc645d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b4x88\" (UID: \"cc0ac95b-a7a9-4b23-a073-99146acc645d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.748662 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8c9da68e-d45d-44c9-be51-0b8a38042692-node-pullsecrets\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " 
pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.748704 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nr92k\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.748766 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a40013f-a275-4839-8f62-09488dbf53a3-config\") pod \"authentication-operator-69f744f599-brgcm\" (UID: \"1a40013f-a275-4839-8f62-09488dbf53a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.748802 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c9da68e-d45d-44c9-be51-0b8a38042692-etcd-client\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.748849 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qczgs\" (UniqueName: \"kubernetes.io/projected/8c9da68e-d45d-44c9-be51-0b8a38042692-kube-api-access-qczgs\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.748883 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8c9da68e-d45d-44c9-be51-0b8a38042692-serving-cert\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.748915 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c9da68e-d45d-44c9-be51-0b8a38042692-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.748987 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cc0ac95b-a7a9-4b23-a073-99146acc645d-images\") pod \"machine-api-operator-5694c8668f-b4x88\" (UID: \"cc0ac95b-a7a9-4b23-a073-99146acc645d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.749075 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a40013f-a275-4839-8f62-09488dbf53a3-serving-cert\") pod \"authentication-operator-69f744f599-brgcm\" (UID: \"1a40013f-a275-4839-8f62-09488dbf53a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.749163 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8c9da68e-d45d-44c9-be51-0b8a38042692-audit\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.749199 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8c9da68e-d45d-44c9-be51-0b8a38042692-encryption-config\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.749234 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0ac95b-a7a9-4b23-a073-99146acc645d-config\") pod \"machine-api-operator-5694c8668f-b4x88\" (UID: \"cc0ac95b-a7a9-4b23-a073-99146acc645d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.749322 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da04224f-997b-4890-b0c8-2bf983b1f21d-serving-cert\") pod \"controller-manager-879f6c89f-nr92k\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.749384 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8c9da68e-d45d-44c9-be51-0b8a38042692-image-import-ca\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.749420 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-client-ca\") pod \"controller-manager-879f6c89f-nr92k\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.749458 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-448bq\" (UniqueName: \"kubernetes.io/projected/da04224f-997b-4890-b0c8-2bf983b1f21d-kube-api-access-448bq\") pod \"controller-manager-879f6c89f-nr92k\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.749503 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-config\") pod \"controller-manager-879f6c89f-nr92k\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.749551 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c9da68e-d45d-44c9-be51-0b8a38042692-audit-dir\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.749608 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8c9da68e-d45d-44c9-be51-0b8a38042692-etcd-serving-ca\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.750171 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 
02:42:39.753508 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.753605 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.753536 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.753537 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.754117 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.754504 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.754867 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.757072 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.757107 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.757215 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.757260 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.757354 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.757537 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.757551 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.757636 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.758239 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.759306 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.760141 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.765138 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.765155 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.765295 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.765431 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.765503 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.765660 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.765698 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.765714 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.766896 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.767001 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 04 02:42:39 crc 
kubenswrapper[4964]: I1004 02:42:39.767135 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.776189 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.777268 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.778154 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.778239 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.779668 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.781273 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.790332 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.790560 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.790705 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.792455 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.792689 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.792964 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.793110 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.793032 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2zdrf"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.793874 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.794815 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.794852 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.795465 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.796114 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.796286 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.797842 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.797982 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.799025 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.799386 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.800128 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 04 
02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.800455 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.800830 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.801095 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.801906 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.801923 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.801998 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.802455 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.803736 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-br64d"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.804502 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-br64d" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.805812 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.806280 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.806720 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.806927 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.807131 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.807338 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.807907 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.808039 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.810197 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.810391 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 04 02:42:39 crc 
kubenswrapper[4964]: I1004 02:42:39.810575 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.811037 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.811040 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-xgvpt"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.811696 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.811762 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xgvpt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.811938 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.812417 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.812480 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.812568 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.812828 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.812775 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.814163 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.815289 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.819554 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.821107 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.822142 4964 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.822877 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.823071 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.823366 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.823772 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.823906 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.823911 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.823957 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.823076 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.825209 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.827410 4964 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.827783 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-cznct"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.845444 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.854774 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.855790 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.856258 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.866650 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.866729 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8c9da68e-d45d-44c9-be51-0b8a38042692-etcd-serving-ca\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.866768 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z56l\" (UniqueName: \"kubernetes.io/projected/1a40013f-a275-4839-8f62-09488dbf53a3-kube-api-access-2z56l\") pod \"authentication-operator-69f744f599-brgcm\" (UID: \"1a40013f-a275-4839-8f62-09488dbf53a3\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.866795 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c9da68e-d45d-44c9-be51-0b8a38042692-config\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.866818 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5rml\" (UniqueName: \"kubernetes.io/projected/cc0ac95b-a7a9-4b23-a073-99146acc645d-kube-api-access-w5rml\") pod \"machine-api-operator-5694c8668f-b4x88\" (UID: \"cc0ac95b-a7a9-4b23-a073-99146acc645d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.866841 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a40013f-a275-4839-8f62-09488dbf53a3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-brgcm\" (UID: \"1a40013f-a275-4839-8f62-09488dbf53a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.866859 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.866877 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a40013f-a275-4839-8f62-09488dbf53a3-service-ca-bundle\") pod \"authentication-operator-69f744f599-brgcm\" (UID: \"1a40013f-a275-4839-8f62-09488dbf53a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 
02:42:39.866908 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc0ac95b-a7a9-4b23-a073-99146acc645d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b4x88\" (UID: \"cc0ac95b-a7a9-4b23-a073-99146acc645d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.866931 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8c9da68e-d45d-44c9-be51-0b8a38042692-node-pullsecrets\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.866952 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nr92k\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.866980 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a40013f-a275-4839-8f62-09488dbf53a3-config\") pod \"authentication-operator-69f744f599-brgcm\" (UID: \"1a40013f-a275-4839-8f62-09488dbf53a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867000 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c9da68e-d45d-44c9-be51-0b8a38042692-etcd-client\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " 
pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867010 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867030 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qczgs\" (UniqueName: \"kubernetes.io/projected/8c9da68e-d45d-44c9-be51-0b8a38042692-kube-api-access-qczgs\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867050 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c9da68e-d45d-44c9-be51-0b8a38042692-serving-cert\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867068 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c9da68e-d45d-44c9-be51-0b8a38042692-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867089 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cc0ac95b-a7a9-4b23-a073-99146acc645d-images\") pod \"machine-api-operator-5694c8668f-b4x88\" (UID: \"cc0ac95b-a7a9-4b23-a073-99146acc645d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867112 4964 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a40013f-a275-4839-8f62-09488dbf53a3-serving-cert\") pod \"authentication-operator-69f744f599-brgcm\" (UID: \"1a40013f-a275-4839-8f62-09488dbf53a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867140 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8c9da68e-d45d-44c9-be51-0b8a38042692-audit\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867160 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8c9da68e-d45d-44c9-be51-0b8a38042692-encryption-config\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867183 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0ac95b-a7a9-4b23-a073-99146acc645d-config\") pod \"machine-api-operator-5694c8668f-b4x88\" (UID: \"cc0ac95b-a7a9-4b23-a073-99146acc645d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867212 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da04224f-997b-4890-b0c8-2bf983b1f21d-serving-cert\") pod \"controller-manager-879f6c89f-nr92k\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867235 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8c9da68e-d45d-44c9-be51-0b8a38042692-image-import-ca\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867259 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-client-ca\") pod \"controller-manager-879f6c89f-nr92k\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867281 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-448bq\" (UniqueName: \"kubernetes.io/projected/da04224f-997b-4890-b0c8-2bf983b1f21d-kube-api-access-448bq\") pod \"controller-manager-879f6c89f-nr92k\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867305 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-config\") pod \"controller-manager-879f6c89f-nr92k\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867327 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c9da68e-d45d-44c9-be51-0b8a38042692-audit-dir\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867395 
4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c9da68e-d45d-44c9-be51-0b8a38042692-audit-dir\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.867922 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8c9da68e-d45d-44c9-be51-0b8a38042692-etcd-serving-ca\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.868110 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c9da68e-d45d-44c9-be51-0b8a38042692-config\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.868367 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.868495 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4zll7"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.868949 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.869146 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cc0ac95b-a7a9-4b23-a073-99146acc645d-images\") pod \"machine-api-operator-5694c8668f-b4x88\" (UID: \"cc0ac95b-a7a9-4b23-a073-99146acc645d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.869601 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a40013f-a275-4839-8f62-09488dbf53a3-service-ca-bundle\") pod \"authentication-operator-69f744f599-brgcm\" (UID: \"1a40013f-a275-4839-8f62-09488dbf53a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.869630 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a40013f-a275-4839-8f62-09488dbf53a3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-brgcm\" (UID: \"1a40013f-a275-4839-8f62-09488dbf53a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.869685 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8c9da68e-d45d-44c9-be51-0b8a38042692-node-pullsecrets\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.870527 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nr92k\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.871330 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sjkbq"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.871926 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.873321 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wppjk"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.873847 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.873402 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a40013f-a275-4839-8f62-09488dbf53a3-config\") pod \"authentication-operator-69f744f599-brgcm\" (UID: \"1a40013f-a275-4839-8f62-09488dbf53a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.879819 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.880560 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.880910 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-khlb2"] Oct 04 02:42:39 
crc kubenswrapper[4964]: I1004 02:42:39.881063 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c9da68e-d45d-44c9-be51-0b8a38042692-serving-cert\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.881370 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.884975 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c9da68e-d45d-44c9-be51-0b8a38042692-etcd-client\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.881735 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c9da68e-d45d-44c9-be51-0b8a38042692-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.882819 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0ac95b-a7a9-4b23-a073-99146acc645d-config\") pod \"machine-api-operator-5694c8668f-b4x88\" (UID: \"cc0ac95b-a7a9-4b23-a073-99146acc645d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.883053 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.883702 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8c9da68e-d45d-44c9-be51-0b8a38042692-image-import-ca\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.883702 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.885322 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.883336 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.881504 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8c9da68e-d45d-44c9-be51-0b8a38042692-audit\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.883377 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.885807 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pg8r8"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.883433 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.886191 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1a40013f-a275-4839-8f62-09488dbf53a3-serving-cert\") pod \"authentication-operator-69f744f599-brgcm\" (UID: \"1a40013f-a275-4839-8f62-09488dbf53a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.883632 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.883732 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.886485 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.886393 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-config\") pod \"controller-manager-879f6c89f-nr92k\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.886519 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pg8r8" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.883864 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.883952 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.886700 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.886893 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.887141 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.887362 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.887464 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.887641 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.891304 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.891818 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.892206 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.892884 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.893771 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da04224f-997b-4890-b0c8-2bf983b1f21d-serving-cert\") pod \"controller-manager-879f6c89f-nr92k\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.893855 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nr92k"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.894457 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dp6lk"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.896805 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8c9da68e-d45d-44c9-be51-0b8a38042692-encryption-config\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.898168 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.898456 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5rml\" (UniqueName: \"kubernetes.io/projected/cc0ac95b-a7a9-4b23-a073-99146acc645d-kube-api-access-w5rml\") pod \"machine-api-operator-5694c8668f-b4x88\" (UID: \"cc0ac95b-a7a9-4b23-a073-99146acc645d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.899435 4964 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2z56l\" (UniqueName: \"kubernetes.io/projected/1a40013f-a275-4839-8f62-09488dbf53a3-kube-api-access-2z56l\") pod \"authentication-operator-69f744f599-brgcm\" (UID: \"1a40013f-a275-4839-8f62-09488dbf53a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.900780 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.901646 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.901725 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dp6lk" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.902054 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.902323 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.902446 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.902581 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.902771 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.903529 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.903629 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b4x88"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.903657 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.903716 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.904074 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-client-ca\") pod \"controller-manager-879f6c89f-nr92k\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.907151 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc0ac95b-a7a9-4b23-a073-99146acc645d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b4x88\" (UID: \"cc0ac95b-a7a9-4b23-a073-99146acc645d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.910712 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.911005 4964 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.912309 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.914280 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-j54j2"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.915033 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d78cz"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.915355 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-j54j2" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.919332 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-brgcm"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.919436 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.929822 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.930875 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.931284 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.937231 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4crxp"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.939489 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkldd"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.940046 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.940209 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkldd" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.941530 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vng4p"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.941935 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.940211 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4crxp" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.942459 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.942405 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wrktt"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.942882 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.944393 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.946068 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8dqhm"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.946183 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wrktt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.947761 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xgvpt"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.950036 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.952218 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4zll7"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.953472 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-cznct"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.955056 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.956881 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-br64d"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.959538 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.961009 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.962803 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.963567 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dp6lk"] Oct 04 02:42:39 
crc kubenswrapper[4964]: I1004 02:42:39.964671 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2zdrf"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.965766 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sjkbq"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.967969 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968350 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968376 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968399 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-trusted-ca-bundle\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968428 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-service-ca\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968445 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968462 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/38588578-4883-453a-8a92-1bef1dc0f479-auth-proxy-config\") pod \"machine-approver-56656f9798-vb4qk\" (UID: \"38588578-4883-453a-8a92-1bef1dc0f479\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968476 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/38588578-4883-453a-8a92-1bef1dc0f479-machine-approver-tls\") pod \"machine-approver-56656f9798-vb4qk\" (UID: \"38588578-4883-453a-8a92-1bef1dc0f479\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968501 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-oauth-serving-cert\") pod \"console-f9d7485db-cznct\" (UID: 
\"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968520 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7b419e3-b339-4a82-8cb1-c14467712c1f-serving-cert\") pod \"route-controller-manager-6576b87f9c-rm5rh\" (UID: \"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968534 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zn49\" (UniqueName: \"kubernetes.io/projected/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-kube-api-access-2zn49\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968549 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptpt7\" (UniqueName: \"kubernetes.io/projected/2e961f5c-6d04-462b-9097-346efdfe347c-kube-api-access-ptpt7\") pod \"downloads-7954f5f757-xgvpt\" (UID: \"2e961f5c-6d04-462b-9097-346efdfe347c\") " pod="openshift-console/downloads-7954f5f757-xgvpt" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968572 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968589 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968604 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/108d1f0d-1354-4b1b-a3c3-fa5b14cee77f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lfrsh\" (UID: \"108d1f0d-1354-4b1b-a3c3-fa5b14cee77f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968635 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bc77816-30a0-4d2c-8817-67fa89c39d35-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s9qmm\" (UID: \"9bc77816-30a0-4d2c-8817-67fa89c39d35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968707 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-etcd-client\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968736 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-error\") pod 
\"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968760 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkfr9\" (UniqueName: \"kubernetes.io/projected/b564efb8-50db-4d84-a456-4857001ab84a-kube-api-access-vkfr9\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968885 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7tc4\" (UniqueName: \"kubernetes.io/projected/c7b419e3-b339-4a82-8cb1-c14467712c1f-kube-api-access-j7tc4\") pod \"route-controller-manager-6576b87f9c-rm5rh\" (UID: \"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.968903 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-serving-cert\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.969004 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7b419e3-b339-4a82-8cb1-c14467712c1f-client-ca\") pod \"route-controller-manager-6576b87f9c-rm5rh\" (UID: \"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.969092 4964 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.969192 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-audit-dir\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.969261 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38588578-4883-453a-8a92-1bef1dc0f479-config\") pod \"machine-approver-56656f9798-vb4qk\" (UID: \"38588578-4883-453a-8a92-1bef1dc0f479\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.969345 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d10fd86-53ea-445e-bdf2-83e8caf22d02-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hwzjn\" (UID: \"8d10fd86-53ea-445e-bdf2-83e8caf22d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.969444 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffh5b\" (UniqueName: \"kubernetes.io/projected/8d10fd86-53ea-445e-bdf2-83e8caf22d02-kube-api-access-ffh5b\") pod 
\"cluster-image-registry-operator-dc59b4c8b-hwzjn\" (UID: \"8d10fd86-53ea-445e-bdf2-83e8caf22d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.969499 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.969585 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.970807 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-j54j2"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973235 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-encryption-config\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973280 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-serving-cert\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973299 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/108d1f0d-1354-4b1b-a3c3-fa5b14cee77f-serving-cert\") pod \"openshift-config-operator-7777fb866f-lfrsh\" (UID: \"108d1f0d-1354-4b1b-a3c3-fa5b14cee77f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973330 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bc77816-30a0-4d2c-8817-67fa89c39d35-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s9qmm\" (UID: \"9bc77816-30a0-4d2c-8817-67fa89c39d35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973354 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e624ca3f-538e-443a-8f2e-aa64988c9ce4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-br64d\" (UID: \"e624ca3f-538e-443a-8f2e-aa64988c9ce4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-br64d" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973370 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-config\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973388 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: 
\"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973404 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47nvs\" (UniqueName: \"kubernetes.io/projected/9bc77816-30a0-4d2c-8817-67fa89c39d35-kube-api-access-47nvs\") pod \"openshift-controller-manager-operator-756b6f6bc6-s9qmm\" (UID: \"9bc77816-30a0-4d2c-8817-67fa89c39d35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973422 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-audit-policies\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973436 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h95zl\" (UniqueName: \"kubernetes.io/projected/108d1f0d-1354-4b1b-a3c3-fa5b14cee77f-kube-api-access-h95zl\") pod \"openshift-config-operator-7777fb866f-lfrsh\" (UID: \"108d1f0d-1354-4b1b-a3c3-fa5b14cee77f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973453 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-audit-policies\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973466 4964 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b564efb8-50db-4d84-a456-4857001ab84a-audit-dir\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973481 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973496 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tztw\" (UniqueName: \"kubernetes.io/projected/e624ca3f-538e-443a-8f2e-aa64988c9ce4-kube-api-access-5tztw\") pod \"cluster-samples-operator-665b6dd947-br64d\" (UID: \"e624ca3f-538e-443a-8f2e-aa64988c9ce4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-br64d" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973511 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-oauth-config\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973527 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8qxg\" (UniqueName: \"kubernetes.io/projected/d30aed64-b8f7-4028-8dfc-f3661ce1c459-kube-api-access-v8qxg\") pod 
\"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973637 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wppjk"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973698 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973719 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a18b874-8ef9-45af-9450-b33fd3c51efd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gmxft\" (UID: \"5a18b874-8ef9-45af-9450-b33fd3c51efd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973758 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a18b874-8ef9-45af-9450-b33fd3c51efd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gmxft\" (UID: \"5a18b874-8ef9-45af-9450-b33fd3c51efd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973821 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d10fd86-53ea-445e-bdf2-83e8caf22d02-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-hwzjn\" (UID: \"8d10fd86-53ea-445e-bdf2-83e8caf22d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973882 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973916 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d10fd86-53ea-445e-bdf2-83e8caf22d02-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hwzjn\" (UID: \"8d10fd86-53ea-445e-bdf2-83e8caf22d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973964 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.973984 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7b419e3-b339-4a82-8cb1-c14467712c1f-config\") pod \"route-controller-manager-6576b87f9c-rm5rh\" (UID: \"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.974013 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2lrl\" (UniqueName: \"kubernetes.io/projected/38588578-4883-453a-8a92-1bef1dc0f479-kube-api-access-t2lrl\") pod \"machine-approver-56656f9798-vb4qk\" (UID: \"38588578-4883-453a-8a92-1bef1dc0f479\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.974040 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcg8f\" (UniqueName: \"kubernetes.io/projected/5a18b874-8ef9-45af-9450-b33fd3c51efd-kube-api-access-rcg8f\") pod \"openshift-apiserver-operator-796bbdcf4f-gmxft\" (UID: \"5a18b874-8ef9-45af-9450-b33fd3c51efd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.974720 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pg8r8"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.976539 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-448bq\" (UniqueName: \"kubernetes.io/projected/da04224f-997b-4890-b0c8-2bf983b1f21d-kube-api-access-448bq\") pod \"controller-manager-879f6c89f-nr92k\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.977464 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-qtbhx"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.978094 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qtbhx" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.979464 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rlnbj"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.980876 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.983239 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qczgs\" (UniqueName: \"kubernetes.io/projected/8c9da68e-d45d-44c9-be51-0b8a38042692-kube-api-access-qczgs\") pod \"apiserver-76f77b778f-8dqhm\" (UID: \"8c9da68e-d45d-44c9-be51-0b8a38042692\") " pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.984046 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.986327 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.987857 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.990037 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.992140 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.993971 4964 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vng4p"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.995227 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkldd"] Oct 04 02:42:39 crc kubenswrapper[4964]: I1004 02:42:39.998407 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t"] Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.000510 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr"] Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.002267 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt"] Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.004267 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4crxp"] Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.006180 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wrktt"] Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.007961 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d78cz"] Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.010107 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.010316 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt"] Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.011773 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g"] Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.012992 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rlnbj"] Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.014163 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-flq6t"] Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.021748 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-flq6t"] Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.021826 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-flq6t" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.030151 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.049971 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.051450 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.071497 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.074789 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c16d39c-7067-4368-b4f6-324d9612c4de-config\") pod \"service-ca-operator-777779d784-vng4p\" (UID: \"3c16d39c-7067-4368-b4f6-324d9612c4de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.074834 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/108d1f0d-1354-4b1b-a3c3-fa5b14cee77f-serving-cert\") pod \"openshift-config-operator-7777fb866f-lfrsh\" (UID: \"108d1f0d-1354-4b1b-a3c3-fa5b14cee77f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.074861 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bc77816-30a0-4d2c-8817-67fa89c39d35-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s9qmm\" (UID: \"9bc77816-30a0-4d2c-8817-67fa89c39d35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.074885 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-config-volume\") pod \"collect-profiles-29325750-kkskc\" (UID: \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.074913 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e624ca3f-538e-443a-8f2e-aa64988c9ce4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-br64d\" (UID: \"e624ca3f-538e-443a-8f2e-aa64988c9ce4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-br64d" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.074935 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-config\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.074959 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.074986 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/070aadf4-bcde-4da7-bbac-6937b7e6937f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vk6lb\" (UID: \"070aadf4-bcde-4da7-bbac-6937b7e6937f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.075007 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7wxk\" 
(UniqueName: \"kubernetes.io/projected/d09e8523-afbc-4f5d-888d-92b350c15f7c-kube-api-access-m7wxk\") pod \"control-plane-machine-set-operator-78cbb6b69f-lkldd\" (UID: \"d09e8523-afbc-4f5d-888d-92b350c15f7c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkldd" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.075032 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h95zl\" (UniqueName: \"kubernetes.io/projected/108d1f0d-1354-4b1b-a3c3-fa5b14cee77f-kube-api-access-h95zl\") pod \"openshift-config-operator-7777fb866f-lfrsh\" (UID: \"108d1f0d-1354-4b1b-a3c3-fa5b14cee77f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.075053 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78rsj\" (UniqueName: \"kubernetes.io/projected/0f7e7f85-336c-4706-ad26-f218ca626bed-kube-api-access-78rsj\") pod \"packageserver-d55dfcdfc-jbtzt\" (UID: \"0f7e7f85-336c-4706-ad26-f218ca626bed\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.075077 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f377b1b3-e592-418c-af18-6cd9c169d9c5-config\") pod \"console-operator-58897d9998-4zll7\" (UID: \"f377b1b3-e592-418c-af18-6cd9c169d9c5\") " pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.075103 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-audit-policies\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.075219 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.075254 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8qxg\" (UniqueName: \"kubernetes.io/projected/d30aed64-b8f7-4028-8dfc-f3661ce1c459-kube-api-access-v8qxg\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.075281 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-etcd-service-ca\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.075784 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076017 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6px6v\" (UniqueName: 
\"kubernetes.io/projected/070aadf4-bcde-4da7-bbac-6937b7e6937f-kube-api-access-6px6v\") pod \"kube-storage-version-migrator-operator-b67b599dd-vk6lb\" (UID: \"070aadf4-bcde-4da7-bbac-6937b7e6937f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076083 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076097 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bc77816-30a0-4d2c-8817-67fa89c39d35-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s9qmm\" (UID: \"9bc77816-30a0-4d2c-8817-67fa89c39d35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076150 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-etcd-ca\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076175 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076270 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/375981dd-1af1-4165-9ba8-ef13b77a7477-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gtc9j\" (UID: \"375981dd-1af1-4165-9ba8-ef13b77a7477\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076304 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp895\" (UniqueName: \"kubernetes.io/projected/1fa08c06-83f8-4256-b721-af9b30f9f915-kube-api-access-qp895\") pod \"machine-config-controller-84d6567774-wpssm\" (UID: \"1fa08c06-83f8-4256-b721-af9b30f9f915\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076353 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076381 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4c72\" (UniqueName: \"kubernetes.io/projected/3c16d39c-7067-4368-b4f6-324d9612c4de-kube-api-access-w4c72\") pod \"service-ca-operator-777779d784-vng4p\" (UID: \"3c16d39c-7067-4368-b4f6-324d9612c4de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076410 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcg8f\" (UniqueName: \"kubernetes.io/projected/5a18b874-8ef9-45af-9450-b33fd3c51efd-kube-api-access-rcg8f\") pod \"openshift-apiserver-operator-796bbdcf4f-gmxft\" (UID: \"5a18b874-8ef9-45af-9450-b33fd3c51efd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076436 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzhwk\" (UniqueName: \"kubernetes.io/projected/2c097e98-163f-4ede-90a4-c9aa7318aaa0-kube-api-access-jzhwk\") pod \"olm-operator-6b444d44fb-qhj8g\" (UID: \"2c097e98-163f-4ede-90a4-c9aa7318aaa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076460 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076485 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076513 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/25b0c75e-6790-4622-80d9-d1182608eb38-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-l4jsz\" (UID: \"25b0c75e-6790-4622-80d9-d1182608eb38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076543 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076565 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/38588578-4883-453a-8a92-1bef1dc0f479-auth-proxy-config\") pod \"machine-approver-56656f9798-vb4qk\" (UID: \"38588578-4883-453a-8a92-1bef1dc0f479\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076588 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/38588578-4883-453a-8a92-1bef1dc0f479-machine-approver-tls\") pod \"machine-approver-56656f9798-vb4qk\" (UID: \"38588578-4883-453a-8a92-1bef1dc0f479\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076660 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2c097e98-163f-4ede-90a4-c9aa7318aaa0-srv-cert\") pod \"olm-operator-6b444d44fb-qhj8g\" (UID: \"2c097e98-163f-4ede-90a4-c9aa7318aaa0\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076721 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zn49\" (UniqueName: \"kubernetes.io/projected/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-kube-api-access-2zn49\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076750 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjxlm\" (UniqueName: \"kubernetes.io/projected/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-kube-api-access-qjxlm\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.076805 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.077405 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-audit-policies\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.077508 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-etcd-serving-ca\") pod 
\"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.077658 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/108d1f0d-1354-4b1b-a3c3-fa5b14cee77f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lfrsh\" (UID: \"108d1f0d-1354-4b1b-a3c3-fa5b14cee77f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.077921 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/38588578-4883-453a-8a92-1bef1dc0f479-auth-proxy-config\") pod \"machine-approver-56656f9798-vb4qk\" (UID: \"38588578-4883-453a-8a92-1bef1dc0f479\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.077936 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078012 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bc77816-30a0-4d2c-8817-67fa89c39d35-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s9qmm\" (UID: \"9bc77816-30a0-4d2c-8817-67fa89c39d35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078042 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f377b1b3-e592-418c-af18-6cd9c169d9c5-serving-cert\") pod \"console-operator-58897d9998-4zll7\" (UID: \"f377b1b3-e592-418c-af18-6cd9c169d9c5\") " pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078068 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-etcd-client\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078088 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6018b91d-752d-4b19-9121-705d34195d35-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mtqzr\" (UID: \"6018b91d-752d-4b19-9121-705d34195d35\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078089 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078146 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56bx9\" (UniqueName: \"kubernetes.io/projected/3f005aad-20eb-493e-8b74-fb1cf25030aa-kube-api-access-56bx9\") pod \"migrator-59844c95c7-dp6lk\" (UID: \"3f005aad-20eb-493e-8b74-fb1cf25030aa\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dp6lk" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078179 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0f7e7f85-336c-4706-ad26-f218ca626bed-apiservice-cert\") pod \"packageserver-d55dfcdfc-jbtzt\" (UID: \"0f7e7f85-336c-4706-ad26-f218ca626bed\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078204 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2c097e98-163f-4ede-90a4-c9aa7318aaa0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qhj8g\" (UID: \"2c097e98-163f-4ede-90a4-c9aa7318aaa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078231 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkfr9\" (UniqueName: \"kubernetes.io/projected/b564efb8-50db-4d84-a456-4857001ab84a-kube-api-access-vkfr9\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: 
\"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078255 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-serving-cert\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078283 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7b419e3-b339-4a82-8cb1-c14467712c1f-client-ca\") pod \"route-controller-manager-6576b87f9c-rm5rh\" (UID: \"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078314 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-etcd-client\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078303 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-config\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078341 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/f7198988-705d-437d-9d57-787fda1d80c7-signing-cabundle\") pod \"service-ca-9c57cc56f-j54j2\" (UID: \"f7198988-705d-437d-9d57-787fda1d80c7\") " pod="openshift-service-ca/service-ca-9c57cc56f-j54j2" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078366 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-secret-volume\") pod \"collect-profiles-29325750-kkskc\" (UID: \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078387 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5873894-8da0-464d-9a26-adad29928a59-metrics-tls\") pod \"dns-operator-744455d44c-pg8r8\" (UID: \"c5873894-8da0-464d-9a26-adad29928a59\") " pod="openshift-dns-operator/dns-operator-744455d44c-pg8r8" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078422 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078451 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38588578-4883-453a-8a92-1bef1dc0f479-config\") pod \"machine-approver-56656f9798-vb4qk\" (UID: \"38588578-4883-453a-8a92-1bef1dc0f479\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078479 4964 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-config\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078640 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffh5b\" (UniqueName: \"kubernetes.io/projected/8d10fd86-53ea-445e-bdf2-83e8caf22d02-kube-api-access-ffh5b\") pod \"cluster-image-registry-operator-dc59b4c8b-hwzjn\" (UID: \"8d10fd86-53ea-445e-bdf2-83e8caf22d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078693 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/375981dd-1af1-4165-9ba8-ef13b77a7477-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gtc9j\" (UID: \"375981dd-1af1-4165-9ba8-ef13b77a7477\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078722 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078761 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-encryption-config\") pod 
\"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078795 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-serving-cert\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.078819 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59e47053-7257-4a23-81d0-c4965cde15bf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-prlpt\" (UID: \"59e47053-7257-4a23-81d0-c4965cde15bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.079108 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38588578-4883-453a-8a92-1bef1dc0f479-config\") pod \"machine-approver-56656f9798-vb4qk\" (UID: \"38588578-4883-453a-8a92-1bef1dc0f479\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.079138 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7b419e3-b339-4a82-8cb1-c14467712c1f-client-ca\") pod \"route-controller-manager-6576b87f9c-rm5rh\" (UID: \"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.079441 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d09e8523-afbc-4f5d-888d-92b350c15f7c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lkldd\" (UID: \"d09e8523-afbc-4f5d-888d-92b350c15f7c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkldd" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.079497 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4kgq\" (UniqueName: \"kubernetes.io/projected/c5873894-8da0-464d-9a26-adad29928a59-kube-api-access-t4kgq\") pod \"dns-operator-744455d44c-pg8r8\" (UID: \"c5873894-8da0-464d-9a26-adad29928a59\") " pod="openshift-dns-operator/dns-operator-744455d44c-pg8r8" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.079538 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47nvs\" (UniqueName: \"kubernetes.io/projected/9bc77816-30a0-4d2c-8817-67fa89c39d35-kube-api-access-47nvs\") pod \"openshift-controller-manager-operator-756b6f6bc6-s9qmm\" (UID: \"9bc77816-30a0-4d2c-8817-67fa89c39d35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.079676 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb432ba7-089d-40a7-a0a7-43b3217a2527-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d78cz\" (UID: \"bb432ba7-089d-40a7-a0a7-43b3217a2527\") " pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.079852 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-audit-policies\") pod 
\"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.079951 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b564efb8-50db-4d84-a456-4857001ab84a-audit-dir\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.080047 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tztw\" (UniqueName: \"kubernetes.io/projected/e624ca3f-538e-443a-8f2e-aa64988c9ce4-kube-api-access-5tztw\") pod \"cluster-samples-operator-665b6dd947-br64d\" (UID: \"e624ca3f-538e-443a-8f2e-aa64988c9ce4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-br64d" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.080096 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.080116 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-oauth-config\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.080146 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fqhx9\" (UniqueName: \"kubernetes.io/projected/f7198988-705d-437d-9d57-787fda1d80c7-kube-api-access-fqhx9\") pod \"service-ca-9c57cc56f-j54j2\" (UID: \"f7198988-705d-437d-9d57-787fda1d80c7\") " pod="openshift-service-ca/service-ca-9c57cc56f-j54j2" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.080233 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.080242 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e624ca3f-538e-443a-8f2e-aa64988c9ce4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-br64d\" (UID: \"e624ca3f-538e-443a-8f2e-aa64988c9ce4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-br64d" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.080309 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b564efb8-50db-4d84-a456-4857001ab84a-audit-dir\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.080344 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/070aadf4-bcde-4da7-bbac-6937b7e6937f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vk6lb\" (UID: \"070aadf4-bcde-4da7-bbac-6937b7e6937f\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.080372 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fa08c06-83f8-4256-b721-af9b30f9f915-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wpssm\" (UID: \"1fa08c06-83f8-4256-b721-af9b30f9f915\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.080411 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a18b874-8ef9-45af-9450-b33fd3c51efd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gmxft\" (UID: \"5a18b874-8ef9-45af-9450-b33fd3c51efd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.080441 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/108d1f0d-1354-4b1b-a3c3-fa5b14cee77f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lfrsh\" (UID: \"108d1f0d-1354-4b1b-a3c3-fa5b14cee77f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.080631 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a18b874-8ef9-45af-9450-b33fd3c51efd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gmxft\" (UID: \"5a18b874-8ef9-45af-9450-b33fd3c51efd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.080859 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7cba096c-8054-4ad9-bb23-185f18482afb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4crxp\" (UID: \"7cba096c-8054-4ad9-bb23-185f18482afb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4crxp" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.080921 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d10fd86-53ea-445e-bdf2-83e8caf22d02-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hwzjn\" (UID: \"8d10fd86-53ea-445e-bdf2-83e8caf22d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.080969 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb432ba7-089d-40a7-a0a7-43b3217a2527-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d78cz\" (UID: \"bb432ba7-089d-40a7-a0a7-43b3217a2527\") " pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.081091 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0f7e7f85-336c-4706-ad26-f218ca626bed-webhook-cert\") pod \"packageserver-d55dfcdfc-jbtzt\" (UID: \"0f7e7f85-336c-4706-ad26-f218ca626bed\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.081132 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6018b91d-752d-4b19-9121-705d34195d35-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-mtqzr\" (UID: \"6018b91d-752d-4b19-9121-705d34195d35\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.081192 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.081238 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d10fd86-53ea-445e-bdf2-83e8caf22d02-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hwzjn\" (UID: \"8d10fd86-53ea-445e-bdf2-83e8caf22d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.081300 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-audit-policies\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.081446 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7b419e3-b339-4a82-8cb1-c14467712c1f-config\") pod \"route-controller-manager-6576b87f9c-rm5rh\" (UID: \"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.081531 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.081683 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2lrl\" (UniqueName: \"kubernetes.io/projected/38588578-4883-453a-8a92-1bef1dc0f479-kube-api-access-t2lrl\") pod \"machine-approver-56656f9798-vb4qk\" (UID: \"38588578-4883-453a-8a92-1bef1dc0f479\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.081838 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e47053-7257-4a23-81d0-c4965cde15bf-config\") pod \"kube-apiserver-operator-766d6c64bb-prlpt\" (UID: \"59e47053-7257-4a23-81d0-c4965cde15bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.081880 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fa08c06-83f8-4256-b721-af9b30f9f915-proxy-tls\") pod \"machine-config-controller-84d6567774-wpssm\" (UID: \"1fa08c06-83f8-4256-b721-af9b30f9f915\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.081906 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-serving-cert\") pod \"etcd-operator-b45778765-sjkbq\" 
(UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.081939 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-trusted-ca-bundle\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.081963 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-service-ca\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.081986 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6018b91d-752d-4b19-9121-705d34195d35-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mtqzr\" (UID: \"6018b91d-752d-4b19-9121-705d34195d35\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082011 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e47053-7257-4a23-81d0-c4965cde15bf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-prlpt\" (UID: \"59e47053-7257-4a23-81d0-c4965cde15bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082032 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/38588578-4883-453a-8a92-1bef1dc0f479-machine-approver-tls\") pod \"machine-approver-56656f9798-vb4qk\" (UID: \"38588578-4883-453a-8a92-1bef1dc0f479\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082038 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f7198988-705d-437d-9d57-787fda1d80c7-signing-key\") pod \"service-ca-9c57cc56f-j54j2\" (UID: \"f7198988-705d-437d-9d57-787fda1d80c7\") " pod="openshift-service-ca/service-ca-9c57cc56f-j54j2" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082100 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhmq6\" (UniqueName: \"kubernetes.io/projected/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-kube-api-access-vhmq6\") pod \"collect-profiles-29325750-kkskc\" (UID: \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082130 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375981dd-1af1-4165-9ba8-ef13b77a7477-config\") pod \"kube-controller-manager-operator-78b949d7b-gtc9j\" (UID: \"375981dd-1af1-4165-9ba8-ef13b77a7477\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082013 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-serving-cert\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:40 crc 
kubenswrapper[4964]: I1004 02:42:40.082161 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-oauth-serving-cert\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082189 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7b419e3-b339-4a82-8cb1-c14467712c1f-serving-cert\") pod \"route-controller-manager-6576b87f9c-rm5rh\" (UID: \"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082196 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d10fd86-53ea-445e-bdf2-83e8caf22d02-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hwzjn\" (UID: \"8d10fd86-53ea-445e-bdf2-83e8caf22d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082220 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptpt7\" (UniqueName: \"kubernetes.io/projected/2e961f5c-6d04-462b-9097-346efdfe347c-kube-api-access-ptpt7\") pod \"downloads-7954f5f757-xgvpt\" (UID: \"2e961f5c-6d04-462b-9097-346efdfe347c\") " pod="openshift-console/downloads-7954f5f757-xgvpt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082242 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/108d1f0d-1354-4b1b-a3c3-fa5b14cee77f-serving-cert\") pod \"openshift-config-operator-7777fb866f-lfrsh\" (UID: \"108d1f0d-1354-4b1b-a3c3-fa5b14cee77f\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082248 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0f7e7f85-336c-4706-ad26-f218ca626bed-tmpfs\") pod \"packageserver-d55dfcdfc-jbtzt\" (UID: \"0f7e7f85-336c-4706-ad26-f218ca626bed\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082302 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tczgs\" (UniqueName: \"kubernetes.io/projected/f377b1b3-e592-418c-af18-6cd9c169d9c5-kube-api-access-tczgs\") pod \"console-operator-58897d9998-4zll7\" (UID: \"f377b1b3-e592-418c-af18-6cd9c169d9c5\") " pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082334 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082354 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f377b1b3-e592-418c-af18-6cd9c169d9c5-trusted-ca\") pod \"console-operator-58897d9998-4zll7\" (UID: \"f377b1b3-e592-418c-af18-6cd9c169d9c5\") " pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082379 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46p58\" 
(UniqueName: \"kubernetes.io/projected/7cba096c-8054-4ad9-bb23-185f18482afb-kube-api-access-46p58\") pod \"multus-admission-controller-857f4d67dd-4crxp\" (UID: \"7cba096c-8054-4ad9-bb23-185f18482afb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4crxp" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082406 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082427 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7tc4\" (UniqueName: \"kubernetes.io/projected/c7b419e3-b339-4a82-8cb1-c14467712c1f-kube-api-access-j7tc4\") pod \"route-controller-manager-6576b87f9c-rm5rh\" (UID: \"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082476 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcndk\" (UniqueName: \"kubernetes.io/projected/bb432ba7-089d-40a7-a0a7-43b3217a2527-kube-api-access-tcndk\") pod \"marketplace-operator-79b997595-d78cz\" (UID: \"bb432ba7-089d-40a7-a0a7-43b3217a2527\") " pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082508 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7j9d\" (UniqueName: \"kubernetes.io/projected/25b0c75e-6790-4622-80d9-d1182608eb38-kube-api-access-w7j9d\") pod \"package-server-manager-789f6589d5-l4jsz\" (UID: 
\"25b0c75e-6790-4622-80d9-d1182608eb38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082535 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c16d39c-7067-4368-b4f6-324d9612c4de-serving-cert\") pod \"service-ca-operator-777779d784-vng4p\" (UID: \"3c16d39c-7067-4368-b4f6-324d9612c4de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082563 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-audit-dir\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082588 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d10fd86-53ea-445e-bdf2-83e8caf22d02-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hwzjn\" (UID: \"8d10fd86-53ea-445e-bdf2-83e8caf22d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082652 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a18b874-8ef9-45af-9450-b33fd3c51efd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gmxft\" (UID: \"5a18b874-8ef9-45af-9450-b33fd3c51efd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082806 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c7b419e3-b339-4a82-8cb1-c14467712c1f-config\") pod \"route-controller-manager-6576b87f9c-rm5rh\" (UID: \"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.082841 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-audit-dir\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.083050 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-trusted-ca-bundle\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.083163 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.083757 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.084247 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-encryption-config\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.084308 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-service-ca\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.084448 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-serving-cert\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.084470 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bc77816-30a0-4d2c-8817-67fa89c39d35-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s9qmm\" (UID: \"9bc77816-30a0-4d2c-8817-67fa89c39d35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.084792 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-oauth-config\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.084903 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-etcd-client\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.086105 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a18b874-8ef9-45af-9450-b33fd3c51efd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gmxft\" (UID: \"5a18b874-8ef9-45af-9450-b33fd3c51efd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.086167 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.086513 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7b419e3-b339-4a82-8cb1-c14467712c1f-serving-cert\") pod \"route-controller-manager-6576b87f9c-rm5rh\" (UID: \"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.087198 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-oauth-serving-cert\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 
02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.088247 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.088304 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.088317 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d10fd86-53ea-445e-bdf2-83e8caf22d02-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hwzjn\" (UID: \"8d10fd86-53ea-445e-bdf2-83e8caf22d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.088563 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.090628 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 
02:42:40.106540 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.111090 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.113680 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.131157 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.154852 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.174125 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184155 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-etcd-client\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184185 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f7198988-705d-437d-9d57-787fda1d80c7-signing-cabundle\") pod \"service-ca-9c57cc56f-j54j2\" (UID: \"f7198988-705d-437d-9d57-787fda1d80c7\") " pod="openshift-service-ca/service-ca-9c57cc56f-j54j2" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184204 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-secret-volume\") pod \"collect-profiles-29325750-kkskc\" (UID: \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184218 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5873894-8da0-464d-9a26-adad29928a59-metrics-tls\") pod \"dns-operator-744455d44c-pg8r8\" (UID: \"c5873894-8da0-464d-9a26-adad29928a59\") " pod="openshift-dns-operator/dns-operator-744455d44c-pg8r8" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184233 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-config\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184264 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/375981dd-1af1-4165-9ba8-ef13b77a7477-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gtc9j\" (UID: \"375981dd-1af1-4165-9ba8-ef13b77a7477\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184284 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59e47053-7257-4a23-81d0-c4965cde15bf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-prlpt\" (UID: \"59e47053-7257-4a23-81d0-c4965cde15bf\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184309 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d09e8523-afbc-4f5d-888d-92b350c15f7c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lkldd\" (UID: \"d09e8523-afbc-4f5d-888d-92b350c15f7c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkldd" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184328 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4kgq\" (UniqueName: \"kubernetes.io/projected/c5873894-8da0-464d-9a26-adad29928a59-kube-api-access-t4kgq\") pod \"dns-operator-744455d44c-pg8r8\" (UID: \"c5873894-8da0-464d-9a26-adad29928a59\") " pod="openshift-dns-operator/dns-operator-744455d44c-pg8r8" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184350 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb432ba7-089d-40a7-a0a7-43b3217a2527-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d78cz\" (UID: \"bb432ba7-089d-40a7-a0a7-43b3217a2527\") " pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184366 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqhx9\" (UniqueName: \"kubernetes.io/projected/f7198988-705d-437d-9d57-787fda1d80c7-kube-api-access-fqhx9\") pod \"service-ca-9c57cc56f-j54j2\" (UID: \"f7198988-705d-437d-9d57-787fda1d80c7\") " pod="openshift-service-ca/service-ca-9c57cc56f-j54j2" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184388 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fa08c06-83f8-4256-b721-af9b30f9f915-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wpssm\" (UID: \"1fa08c06-83f8-4256-b721-af9b30f9f915\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184403 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/070aadf4-bcde-4da7-bbac-6937b7e6937f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vk6lb\" (UID: \"070aadf4-bcde-4da7-bbac-6937b7e6937f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184426 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7cba096c-8054-4ad9-bb23-185f18482afb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4crxp\" (UID: \"7cba096c-8054-4ad9-bb23-185f18482afb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4crxp" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184445 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb432ba7-089d-40a7-a0a7-43b3217a2527-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d78cz\" (UID: \"bb432ba7-089d-40a7-a0a7-43b3217a2527\") " pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184460 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0f7e7f85-336c-4706-ad26-f218ca626bed-webhook-cert\") pod \"packageserver-d55dfcdfc-jbtzt\" (UID: \"0f7e7f85-336c-4706-ad26-f218ca626bed\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184478 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6018b91d-752d-4b19-9121-705d34195d35-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mtqzr\" (UID: \"6018b91d-752d-4b19-9121-705d34195d35\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184500 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e47053-7257-4a23-81d0-c4965cde15bf-config\") pod \"kube-apiserver-operator-766d6c64bb-prlpt\" (UID: \"59e47053-7257-4a23-81d0-c4965cde15bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184513 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fa08c06-83f8-4256-b721-af9b30f9f915-proxy-tls\") pod \"machine-config-controller-84d6567774-wpssm\" (UID: \"1fa08c06-83f8-4256-b721-af9b30f9f915\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184528 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-serving-cert\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184543 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6018b91d-752d-4b19-9121-705d34195d35-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mtqzr\" (UID: \"6018b91d-752d-4b19-9121-705d34195d35\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184558 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e47053-7257-4a23-81d0-c4965cde15bf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-prlpt\" (UID: \"59e47053-7257-4a23-81d0-c4965cde15bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184572 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f7198988-705d-437d-9d57-787fda1d80c7-signing-key\") pod \"service-ca-9c57cc56f-j54j2\" (UID: \"f7198988-705d-437d-9d57-787fda1d80c7\") " pod="openshift-service-ca/service-ca-9c57cc56f-j54j2" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184588 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhmq6\" (UniqueName: \"kubernetes.io/projected/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-kube-api-access-vhmq6\") pod \"collect-profiles-29325750-kkskc\" (UID: \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184603 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375981dd-1af1-4165-9ba8-ef13b77a7477-config\") pod \"kube-controller-manager-operator-78b949d7b-gtc9j\" (UID: \"375981dd-1af1-4165-9ba8-ef13b77a7477\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j" Oct 04 02:42:40 crc 
kubenswrapper[4964]: I1004 02:42:40.184647 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0f7e7f85-336c-4706-ad26-f218ca626bed-tmpfs\") pod \"packageserver-d55dfcdfc-jbtzt\" (UID: \"0f7e7f85-336c-4706-ad26-f218ca626bed\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184665 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tczgs\" (UniqueName: \"kubernetes.io/projected/f377b1b3-e592-418c-af18-6cd9c169d9c5-kube-api-access-tczgs\") pod \"console-operator-58897d9998-4zll7\" (UID: \"f377b1b3-e592-418c-af18-6cd9c169d9c5\") " pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184682 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f377b1b3-e592-418c-af18-6cd9c169d9c5-trusted-ca\") pod \"console-operator-58897d9998-4zll7\" (UID: \"f377b1b3-e592-418c-af18-6cd9c169d9c5\") " pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184707 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46p58\" (UniqueName: \"kubernetes.io/projected/7cba096c-8054-4ad9-bb23-185f18482afb-kube-api-access-46p58\") pod \"multus-admission-controller-857f4d67dd-4crxp\" (UID: \"7cba096c-8054-4ad9-bb23-185f18482afb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4crxp" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184731 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7j9d\" (UniqueName: \"kubernetes.io/projected/25b0c75e-6790-4622-80d9-d1182608eb38-kube-api-access-w7j9d\") pod \"package-server-manager-789f6589d5-l4jsz\" (UID: 
\"25b0c75e-6790-4622-80d9-d1182608eb38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184748 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcndk\" (UniqueName: \"kubernetes.io/projected/bb432ba7-089d-40a7-a0a7-43b3217a2527-kube-api-access-tcndk\") pod \"marketplace-operator-79b997595-d78cz\" (UID: \"bb432ba7-089d-40a7-a0a7-43b3217a2527\") " pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184765 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c16d39c-7067-4368-b4f6-324d9612c4de-serving-cert\") pod \"service-ca-operator-777779d784-vng4p\" (UID: \"3c16d39c-7067-4368-b4f6-324d9612c4de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184783 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c16d39c-7067-4368-b4f6-324d9612c4de-config\") pod \"service-ca-operator-777779d784-vng4p\" (UID: \"3c16d39c-7067-4368-b4f6-324d9612c4de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184800 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-config-volume\") pod \"collect-profiles-29325750-kkskc\" (UID: \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184815 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/070aadf4-bcde-4da7-bbac-6937b7e6937f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vk6lb\" (UID: \"070aadf4-bcde-4da7-bbac-6937b7e6937f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184842 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7wxk\" (UniqueName: \"kubernetes.io/projected/d09e8523-afbc-4f5d-888d-92b350c15f7c-kube-api-access-m7wxk\") pod \"control-plane-machine-set-operator-78cbb6b69f-lkldd\" (UID: \"d09e8523-afbc-4f5d-888d-92b350c15f7c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkldd" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184862 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78rsj\" (UniqueName: \"kubernetes.io/projected/0f7e7f85-336c-4706-ad26-f218ca626bed-kube-api-access-78rsj\") pod \"packageserver-d55dfcdfc-jbtzt\" (UID: \"0f7e7f85-336c-4706-ad26-f218ca626bed\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184878 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f377b1b3-e592-418c-af18-6cd9c169d9c5-config\") pod \"console-operator-58897d9998-4zll7\" (UID: \"f377b1b3-e592-418c-af18-6cd9c169d9c5\") " pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184903 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-etcd-service-ca\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 
04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184920 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6px6v\" (UniqueName: \"kubernetes.io/projected/070aadf4-bcde-4da7-bbac-6937b7e6937f-kube-api-access-6px6v\") pod \"kube-storage-version-migrator-operator-b67b599dd-vk6lb\" (UID: \"070aadf4-bcde-4da7-bbac-6937b7e6937f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184936 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-etcd-ca\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184952 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/375981dd-1af1-4165-9ba8-ef13b77a7477-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gtc9j\" (UID: \"375981dd-1af1-4165-9ba8-ef13b77a7477\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184966 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp895\" (UniqueName: \"kubernetes.io/projected/1fa08c06-83f8-4256-b721-af9b30f9f915-kube-api-access-qp895\") pod \"machine-config-controller-84d6567774-wpssm\" (UID: \"1fa08c06-83f8-4256-b721-af9b30f9f915\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.184984 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4c72\" (UniqueName: 
\"kubernetes.io/projected/3c16d39c-7067-4368-b4f6-324d9612c4de-kube-api-access-w4c72\") pod \"service-ca-operator-777779d784-vng4p\" (UID: \"3c16d39c-7067-4368-b4f6-324d9612c4de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.185004 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzhwk\" (UniqueName: \"kubernetes.io/projected/2c097e98-163f-4ede-90a4-c9aa7318aaa0-kube-api-access-jzhwk\") pod \"olm-operator-6b444d44fb-qhj8g\" (UID: \"2c097e98-163f-4ede-90a4-c9aa7318aaa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.185018 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/25b0c75e-6790-4622-80d9-d1182608eb38-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-l4jsz\" (UID: \"25b0c75e-6790-4622-80d9-d1182608eb38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.185034 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2c097e98-163f-4ede-90a4-c9aa7318aaa0-srv-cert\") pod \"olm-operator-6b444d44fb-qhj8g\" (UID: \"2c097e98-163f-4ede-90a4-c9aa7318aaa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.185054 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjxlm\" (UniqueName: \"kubernetes.io/projected/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-kube-api-access-qjxlm\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 
02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.185077 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f377b1b3-e592-418c-af18-6cd9c169d9c5-serving-cert\") pod \"console-operator-58897d9998-4zll7\" (UID: \"f377b1b3-e592-418c-af18-6cd9c169d9c5\") " pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.185094 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6018b91d-752d-4b19-9121-705d34195d35-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mtqzr\" (UID: \"6018b91d-752d-4b19-9121-705d34195d35\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.185109 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56bx9\" (UniqueName: \"kubernetes.io/projected/3f005aad-20eb-493e-8b74-fb1cf25030aa-kube-api-access-56bx9\") pod \"migrator-59844c95c7-dp6lk\" (UID: \"3f005aad-20eb-493e-8b74-fb1cf25030aa\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dp6lk" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.185123 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0f7e7f85-336c-4706-ad26-f218ca626bed-apiservice-cert\") pod \"packageserver-d55dfcdfc-jbtzt\" (UID: \"0f7e7f85-336c-4706-ad26-f218ca626bed\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.185139 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2c097e98-163f-4ede-90a4-c9aa7318aaa0-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-qhj8g\" (UID: \"2c097e98-163f-4ede-90a4-c9aa7318aaa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.186307 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-config\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.187089 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fa08c06-83f8-4256-b721-af9b30f9f915-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wpssm\" (UID: \"1fa08c06-83f8-4256-b721-af9b30f9f915\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.188540 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f377b1b3-e592-418c-af18-6cd9c169d9c5-trusted-ca\") pod \"console-operator-58897d9998-4zll7\" (UID: \"f377b1b3-e592-418c-af18-6cd9c169d9c5\") " pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.188728 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-etcd-client\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.189149 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-etcd-ca\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.190372 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-serving-cert\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.190818 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0f7e7f85-336c-4706-ad26-f218ca626bed-tmpfs\") pod \"packageserver-d55dfcdfc-jbtzt\" (UID: \"0f7e7f85-336c-4706-ad26-f218ca626bed\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.190994 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.191414 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f377b1b3-e592-418c-af18-6cd9c169d9c5-serving-cert\") pod \"console-operator-58897d9998-4zll7\" (UID: \"f377b1b3-e592-418c-af18-6cd9c169d9c5\") " pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.191948 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f377b1b3-e592-418c-af18-6cd9c169d9c5-config\") pod \"console-operator-58897d9998-4zll7\" (UID: \"f377b1b3-e592-418c-af18-6cd9c169d9c5\") " pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:40 crc 
kubenswrapper[4964]: I1004 02:42:40.192282 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-etcd-service-ca\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.238019 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.238238 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.250584 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.273080 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.312826 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.314402 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.331047 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.350482 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.371001 4964 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-certs-default" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.382249 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-brgcm"] Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.384686 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b4x88"] Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.391048 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.411094 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.430502 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.449917 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.469918 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.481697 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/375981dd-1af1-4165-9ba8-ef13b77a7477-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gtc9j\" (UID: \"375981dd-1af1-4165-9ba8-ef13b77a7477\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.490381 4964 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.492085 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375981dd-1af1-4165-9ba8-ef13b77a7477-config\") pod \"kube-controller-manager-operator-78b949d7b-gtc9j\" (UID: \"375981dd-1af1-4165-9ba8-ef13b77a7477\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.509684 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.535487 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.550750 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.559697 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8dqhm"] Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.561371 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nr92k"] Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.562158 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5873894-8da0-464d-9a26-adad29928a59-metrics-tls\") pod \"dns-operator-744455d44c-pg8r8\" (UID: \"c5873894-8da0-464d-9a26-adad29928a59\") " pod="openshift-dns-operator/dns-operator-744455d44c-pg8r8" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.570231 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" 
Oct 04 02:42:40 crc kubenswrapper[4964]: W1004 02:42:40.582709 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda04224f_997b_4890_b0c8_2bf983b1f21d.slice/crio-7be4ebfdf3a31707aa09ca0a18458bfd27ceded8e8d5fdebf1a2c8e9c7e5f378 WatchSource:0}: Error finding container 7be4ebfdf3a31707aa09ca0a18458bfd27ceded8e8d5fdebf1a2c8e9c7e5f378: Status 404 returned error can't find the container with id 7be4ebfdf3a31707aa09ca0a18458bfd27ceded8e8d5fdebf1a2c8e9c7e5f378 Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.610255 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.630898 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.641191 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fa08c06-83f8-4256-b721-af9b30f9f915-proxy-tls\") pod \"machine-config-controller-84d6567774-wpssm\" (UID: \"1fa08c06-83f8-4256-b721-af9b30f9f915\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.650352 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.670668 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.676337 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" 
event={"ID":"8c9da68e-d45d-44c9-be51-0b8a38042692","Type":"ContainerStarted","Data":"f331c17aaad8181b1ffa9d971501c51b9c2dcc143702bfe7629195eabc357c5d"} Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.677602 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" event={"ID":"1a40013f-a275-4839-8f62-09488dbf53a3","Type":"ContainerStarted","Data":"c677b1e7769cfa382e8247a832fae287012ee0d9db4e75cf25d0835b497f51fc"} Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.678588 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" event={"ID":"cc0ac95b-a7a9-4b23-a073-99146acc645d","Type":"ContainerStarted","Data":"161f0902029f1b305faeb973da43009379ee06b6775312caa3707a94c0a4d090"} Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.679464 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" event={"ID":"da04224f-997b-4890-b0c8-2bf983b1f21d","Type":"ContainerStarted","Data":"7be4ebfdf3a31707aa09ca0a18458bfd27ceded8e8d5fdebf1a2c8e9c7e5f378"} Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.691039 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.712019 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.718520 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2c097e98-163f-4ede-90a4-c9aa7318aaa0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qhj8g\" (UID: \"2c097e98-163f-4ede-90a4-c9aa7318aaa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" Oct 04 
02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.722029 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-secret-volume\") pod \"collect-profiles-29325750-kkskc\" (UID: \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.730330 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.735910 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e47053-7257-4a23-81d0-c4965cde15bf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-prlpt\" (UID: \"59e47053-7257-4a23-81d0-c4965cde15bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.750684 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.759011 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e47053-7257-4a23-81d0-c4965cde15bf-config\") pod \"kube-apiserver-operator-766d6c64bb-prlpt\" (UID: \"59e47053-7257-4a23-81d0-c4965cde15bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.771309 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.797864 4964 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.811121 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.822754 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2c097e98-163f-4ede-90a4-c9aa7318aaa0-srv-cert\") pod \"olm-operator-6b444d44fb-qhj8g\" (UID: \"2c097e98-163f-4ede-90a4-c9aa7318aaa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.830361 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.849832 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.871258 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.890331 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.909552 4964 request.go:700] Waited for 1.006573183s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpackageserver-service-cert&limit=500&resourceVersion=0 Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.911344 4964 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.916065 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0f7e7f85-336c-4706-ad26-f218ca626bed-apiservice-cert\") pod \"packageserver-d55dfcdfc-jbtzt\" (UID: \"0f7e7f85-336c-4706-ad26-f218ca626bed\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.930368 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.931020 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6018b91d-752d-4b19-9121-705d34195d35-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mtqzr\" (UID: \"6018b91d-752d-4b19-9121-705d34195d35\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.950716 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.970368 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.975842 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6018b91d-752d-4b19-9121-705d34195d35-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mtqzr\" (UID: \"6018b91d-752d-4b19-9121-705d34195d35\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr" Oct 
04 02:42:40 crc kubenswrapper[4964]: I1004 02:42:40.990723 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.010741 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.024123 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/25b0c75e-6790-4622-80d9-d1182608eb38-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-l4jsz\" (UID: \"25b0c75e-6790-4622-80d9-d1182608eb38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.030528 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.050322 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.063907 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0f7e7f85-336c-4706-ad26-f218ca626bed-webhook-cert\") pod \"packageserver-d55dfcdfc-jbtzt\" (UID: \"0f7e7f85-336c-4706-ad26-f218ca626bed\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.070575 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.074644 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/f7198988-705d-437d-9d57-787fda1d80c7-signing-key\") pod \"service-ca-9c57cc56f-j54j2\" (UID: \"f7198988-705d-437d-9d57-787fda1d80c7\") " pod="openshift-service-ca/service-ca-9c57cc56f-j54j2" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.092128 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.112221 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.131868 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.158205 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.168425 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb432ba7-089d-40a7-a0a7-43b3217a2527-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d78cz\" (UID: \"bb432ba7-089d-40a7-a0a7-43b3217a2527\") " pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.170817 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.182813 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb432ba7-089d-40a7-a0a7-43b3217a2527-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d78cz\" (UID: \"bb432ba7-089d-40a7-a0a7-43b3217a2527\") " pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" Oct 04 02:42:41 crc 
kubenswrapper[4964]: E1004 02:42:41.186608 4964 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Oct 04 02:42:41 crc kubenswrapper[4964]: E1004 02:42:41.186680 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d09e8523-afbc-4f5d-888d-92b350c15f7c-control-plane-machine-set-operator-tls podName:d09e8523-afbc-4f5d-888d-92b350c15f7c nodeName:}" failed. No retries permitted until 2025-10-04 02:42:41.686662056 +0000 UTC m=+141.583620704 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/d09e8523-afbc-4f5d-888d-92b350c15f7c-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-lkldd" (UID: "d09e8523-afbc-4f5d-888d-92b350c15f7c") : failed to sync secret cache: timed out waiting for the condition Oct 04 02:42:41 crc kubenswrapper[4964]: E1004 02:42:41.188072 4964 secret.go:188] Couldn't get secret openshift-kube-storage-version-migrator-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 04 02:42:41 crc kubenswrapper[4964]: E1004 02:42:41.188129 4964 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Oct 04 02:42:41 crc kubenswrapper[4964]: E1004 02:42:41.188163 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/070aadf4-bcde-4da7-bbac-6937b7e6937f-serving-cert podName:070aadf4-bcde-4da7-bbac-6937b7e6937f nodeName:}" failed. No retries permitted until 2025-10-04 02:42:41.688141904 +0000 UTC m=+141.585100582 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/070aadf4-bcde-4da7-bbac-6937b7e6937f-serving-cert") pod "kube-storage-version-migrator-operator-b67b599dd-vk6lb" (UID: "070aadf4-bcde-4da7-bbac-6937b7e6937f") : failed to sync secret cache: timed out waiting for the condition Oct 04 02:42:41 crc kubenswrapper[4964]: E1004 02:42:41.188307 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cba096c-8054-4ad9-bb23-185f18482afb-webhook-certs podName:7cba096c-8054-4ad9-bb23-185f18482afb nodeName:}" failed. No retries permitted until 2025-10-04 02:42:41.688274158 +0000 UTC m=+141.585232826 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7cba096c-8054-4ad9-bb23-185f18482afb-webhook-certs") pod "multus-admission-controller-857f4d67dd-4crxp" (UID: "7cba096c-8054-4ad9-bb23-185f18482afb") : failed to sync secret cache: timed out waiting for the condition Oct 04 02:42:41 crc kubenswrapper[4964]: E1004 02:42:41.189173 4964 configmap.go:193] Couldn't get configMap openshift-kube-storage-version-migrator-operator/config: failed to sync configmap cache: timed out waiting for the condition Oct 04 02:42:41 crc kubenswrapper[4964]: E1004 02:42:41.189222 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/070aadf4-bcde-4da7-bbac-6937b7e6937f-config podName:070aadf4-bcde-4da7-bbac-6937b7e6937f nodeName:}" failed. No retries permitted until 2025-10-04 02:42:41.689211308 +0000 UTC m=+141.586169956 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/070aadf4-bcde-4da7-bbac-6937b7e6937f-config") pod "kube-storage-version-migrator-operator-b67b599dd-vk6lb" (UID: "070aadf4-bcde-4da7-bbac-6937b7e6937f") : failed to sync configmap cache: timed out waiting for the condition Oct 04 02:42:41 crc kubenswrapper[4964]: E1004 02:42:41.189225 4964 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Oct 04 02:42:41 crc kubenswrapper[4964]: E1004 02:42:41.189252 4964 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Oct 04 02:42:41 crc kubenswrapper[4964]: E1004 02:42:41.189280 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-config-volume podName:77f14c97-6ee6-4ff8-b98d-3c8cd595b994 nodeName:}" failed. No retries permitted until 2025-10-04 02:42:41.68927064 +0000 UTC m=+141.586229288 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-config-volume") pod "collect-profiles-29325750-kkskc" (UID: "77f14c97-6ee6-4ff8-b98d-3c8cd595b994") : failed to sync configmap cache: timed out waiting for the condition Oct 04 02:42:41 crc kubenswrapper[4964]: E1004 02:42:41.189298 4964 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 04 02:42:41 crc kubenswrapper[4964]: E1004 02:42:41.189327 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c16d39c-7067-4368-b4f6-324d9612c4de-config podName:3c16d39c-7067-4368-b4f6-324d9612c4de nodeName:}" failed. 
No retries permitted until 2025-10-04 02:42:41.689298081 +0000 UTC m=+141.586256759 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/3c16d39c-7067-4368-b4f6-324d9612c4de-config") pod "service-ca-operator-777779d784-vng4p" (UID: "3c16d39c-7067-4368-b4f6-324d9612c4de") : failed to sync configmap cache: timed out waiting for the condition Oct 04 02:42:41 crc kubenswrapper[4964]: E1004 02:42:41.189356 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c16d39c-7067-4368-b4f6-324d9612c4de-serving-cert podName:3c16d39c-7067-4368-b4f6-324d9612c4de nodeName:}" failed. No retries permitted until 2025-10-04 02:42:41.689344062 +0000 UTC m=+141.586302730 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3c16d39c-7067-4368-b4f6-324d9612c4de-serving-cert") pod "service-ca-operator-777779d784-vng4p" (UID: "3c16d39c-7067-4368-b4f6-324d9612c4de") : failed to sync secret cache: timed out waiting for the condition Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.190514 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.210044 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.231136 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.248536 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f7198988-705d-437d-9d57-787fda1d80c7-signing-cabundle\") pod \"service-ca-9c57cc56f-j54j2\" (UID: \"f7198988-705d-437d-9d57-787fda1d80c7\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-j54j2" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.250917 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.271025 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.291744 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.311836 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.331546 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.350730 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.413163 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.415208 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.415554 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.431169 4964 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.451109 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.470689 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.490705 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.510051 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.530723 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.551180 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.571363 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.592968 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.612161 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.631078 4964 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.650797 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.670704 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.683670 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" event={"ID":"1a40013f-a275-4839-8f62-09488dbf53a3","Type":"ContainerStarted","Data":"bcd1e87c0fdfe91a8f42e30a3c7bcf3073f046ee347ded05bac2492b401f319d"} Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.685817 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" event={"ID":"cc0ac95b-a7a9-4b23-a073-99146acc645d","Type":"ContainerStarted","Data":"fcd2578ef46a7f04cb2f41a1257d0927a62e277c6932dd8e3e7b898c9eed90b2"} Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.696793 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" event={"ID":"da04224f-997b-4890-b0c8-2bf983b1f21d","Type":"ContainerStarted","Data":"dde9250f8905fa22eb0e6a9888a34b5013ae5c645a8507bfe8521fa0c4309bdc"} Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.698706 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" event={"ID":"8c9da68e-d45d-44c9-be51-0b8a38042692","Type":"ContainerStarted","Data":"6d88e313a3d5a266356840e11b6813d14edc88e4073a46dbbce3136f823df7c7"} Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.711750 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 04 02:42:41 crc 
kubenswrapper[4964]: I1004 02:42:41.719647 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c16d39c-7067-4368-b4f6-324d9612c4de-serving-cert\") pod \"service-ca-operator-777779d784-vng4p\" (UID: \"3c16d39c-7067-4368-b4f6-324d9612c4de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.719680 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c16d39c-7067-4368-b4f6-324d9612c4de-config\") pod \"service-ca-operator-777779d784-vng4p\" (UID: \"3c16d39c-7067-4368-b4f6-324d9612c4de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.719700 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-config-volume\") pod \"collect-profiles-29325750-kkskc\" (UID: \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.719721 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/070aadf4-bcde-4da7-bbac-6937b7e6937f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vk6lb\" (UID: \"070aadf4-bcde-4da7-bbac-6937b7e6937f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.719947 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d09e8523-afbc-4f5d-888d-92b350c15f7c-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-lkldd\" (UID: \"d09e8523-afbc-4f5d-888d-92b350c15f7c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkldd" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.720011 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/070aadf4-bcde-4da7-bbac-6937b7e6937f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vk6lb\" (UID: \"070aadf4-bcde-4da7-bbac-6937b7e6937f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.720051 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7cba096c-8054-4ad9-bb23-185f18482afb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4crxp\" (UID: \"7cba096c-8054-4ad9-bb23-185f18482afb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4crxp" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.720558 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c16d39c-7067-4368-b4f6-324d9612c4de-config\") pod \"service-ca-operator-777779d784-vng4p\" (UID: \"3c16d39c-7067-4368-b4f6-324d9612c4de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.721871 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-config-volume\") pod \"collect-profiles-29325750-kkskc\" (UID: \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.722133 4964 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/070aadf4-bcde-4da7-bbac-6937b7e6937f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vk6lb\" (UID: \"070aadf4-bcde-4da7-bbac-6937b7e6937f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.723294 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7cba096c-8054-4ad9-bb23-185f18482afb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4crxp\" (UID: \"7cba096c-8054-4ad9-bb23-185f18482afb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4crxp" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.724446 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d09e8523-afbc-4f5d-888d-92b350c15f7c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lkldd\" (UID: \"d09e8523-afbc-4f5d-888d-92b350c15f7c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkldd" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.727166 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c16d39c-7067-4368-b4f6-324d9612c4de-serving-cert\") pod \"service-ca-operator-777779d784-vng4p\" (UID: \"3c16d39c-7067-4368-b4f6-324d9612c4de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.727690 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/070aadf4-bcde-4da7-bbac-6937b7e6937f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vk6lb\" (UID: 
\"070aadf4-bcde-4da7-bbac-6937b7e6937f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.730828 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.751301 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.770773 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.791142 4964 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.811811 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.831600 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.850584 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.871422 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.915940 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h95zl\" (UniqueName: \"kubernetes.io/projected/108d1f0d-1354-4b1b-a3c3-fa5b14cee77f-kube-api-access-h95zl\") pod \"openshift-config-operator-7777fb866f-lfrsh\" (UID: \"108d1f0d-1354-4b1b-a3c3-fa5b14cee77f\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.928801 4964 request.go:700] Waited for 1.852104474s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.939387 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8qxg\" (UniqueName: \"kubernetes.io/projected/d30aed64-b8f7-4028-8dfc-f3661ce1c459-kube-api-access-v8qxg\") pod \"console-f9d7485db-cznct\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.947787 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcg8f\" (UniqueName: \"kubernetes.io/projected/5a18b874-8ef9-45af-9450-b33fd3c51efd-kube-api-access-rcg8f\") pod \"openshift-apiserver-operator-796bbdcf4f-gmxft\" (UID: \"5a18b874-8ef9-45af-9450-b33fd3c51efd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.979497 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zn49\" (UniqueName: \"kubernetes.io/projected/c90c99ad-ac5c-4b40-8eba-e10fa92c1059-kube-api-access-2zn49\") pod \"apiserver-7bbb656c7d-ffdcs\" (UID: \"c90c99ad-ac5c-4b40-8eba-e10fa92c1059\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:41 crc kubenswrapper[4964]: I1004 02:42:41.987131 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkfr9\" (UniqueName: \"kubernetes.io/projected/b564efb8-50db-4d84-a456-4857001ab84a-kube-api-access-vkfr9\") pod \"oauth-openshift-558db77b4-2zdrf\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.009488 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffh5b\" (UniqueName: \"kubernetes.io/projected/8d10fd86-53ea-445e-bdf2-83e8caf22d02-kube-api-access-ffh5b\") pod \"cluster-image-registry-operator-dc59b4c8b-hwzjn\" (UID: \"8d10fd86-53ea-445e-bdf2-83e8caf22d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.023638 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.041517 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47nvs\" (UniqueName: \"kubernetes.io/projected/9bc77816-30a0-4d2c-8817-67fa89c39d35-kube-api-access-47nvs\") pod \"openshift-controller-manager-operator-756b6f6bc6-s9qmm\" (UID: \"9bc77816-30a0-4d2c-8817-67fa89c39d35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.052438 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.057723 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tztw\" (UniqueName: \"kubernetes.io/projected/e624ca3f-538e-443a-8f2e-aa64988c9ce4-kube-api-access-5tztw\") pod \"cluster-samples-operator-665b6dd947-br64d\" (UID: \"e624ca3f-538e-443a-8f2e-aa64988c9ce4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-br64d" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.062653 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.072924 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-br64d" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.085574 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d10fd86-53ea-445e-bdf2-83e8caf22d02-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hwzjn\" (UID: \"8d10fd86-53ea-445e-bdf2-83e8caf22d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.093493 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.100545 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.107463 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.114153 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.128183 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptpt7\" (UniqueName: \"kubernetes.io/projected/2e961f5c-6d04-462b-9097-346efdfe347c-kube-api-access-ptpt7\") pod \"downloads-7954f5f757-xgvpt\" (UID: \"2e961f5c-6d04-462b-9097-346efdfe347c\") " pod="openshift-console/downloads-7954f5f757-xgvpt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.139603 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7tc4\" (UniqueName: \"kubernetes.io/projected/c7b419e3-b339-4a82-8cb1-c14467712c1f-kube-api-access-j7tc4\") pod \"route-controller-manager-6576b87f9c-rm5rh\" (UID: \"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.141407 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2lrl\" (UniqueName: \"kubernetes.io/projected/38588578-4883-453a-8a92-1bef1dc0f479-kube-api-access-t2lrl\") pod \"machine-approver-56656f9798-vb4qk\" (UID: \"38588578-4883-453a-8a92-1bef1dc0f479\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.158741 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tczgs\" (UniqueName: \"kubernetes.io/projected/f377b1b3-e592-418c-af18-6cd9c169d9c5-kube-api-access-tczgs\") pod \"console-operator-58897d9998-4zll7\" (UID: \"f377b1b3-e592-418c-af18-6cd9c169d9c5\") " pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.170551 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/59e47053-7257-4a23-81d0-c4965cde15bf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-prlpt\" (UID: \"59e47053-7257-4a23-81d0-c4965cde15bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.188806 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.191165 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4kgq\" (UniqueName: \"kubernetes.io/projected/c5873894-8da0-464d-9a26-adad29928a59-kube-api-access-t4kgq\") pod \"dns-operator-744455d44c-pg8r8\" (UID: \"c5873894-8da0-464d-9a26-adad29928a59\") " pod="openshift-dns-operator/dns-operator-744455d44c-pg8r8" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.204589 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqhx9\" (UniqueName: \"kubernetes.io/projected/f7198988-705d-437d-9d57-787fda1d80c7-kube-api-access-fqhx9\") pod \"service-ca-9c57cc56f-j54j2\" (UID: \"f7198988-705d-437d-9d57-787fda1d80c7\") " pod="openshift-service-ca/service-ca-9c57cc56f-j54j2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.231580 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.233184 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6018b91d-752d-4b19-9121-705d34195d35-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mtqzr\" (UID: \"6018b91d-752d-4b19-9121-705d34195d35\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.253869 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6px6v\" (UniqueName: \"kubernetes.io/projected/070aadf4-bcde-4da7-bbac-6937b7e6937f-kube-api-access-6px6v\") pod \"kube-storage-version-migrator-operator-b67b599dd-vk6lb\" (UID: \"070aadf4-bcde-4da7-bbac-6937b7e6937f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.261045 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-j54j2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.268495 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46p58\" (UniqueName: \"kubernetes.io/projected/7cba096c-8054-4ad9-bb23-185f18482afb-kube-api-access-46p58\") pod \"multus-admission-controller-857f4d67dd-4crxp\" (UID: \"7cba096c-8054-4ad9-bb23-185f18482afb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4crxp" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.285780 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.289200 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7j9d\" (UniqueName: \"kubernetes.io/projected/25b0c75e-6790-4622-80d9-d1182608eb38-kube-api-access-w7j9d\") pod \"package-server-manager-789f6589d5-l4jsz\" (UID: \"25b0c75e-6790-4622-80d9-d1182608eb38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.307276 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcndk\" (UniqueName: \"kubernetes.io/projected/bb432ba7-089d-40a7-a0a7-43b3217a2527-kube-api-access-tcndk\") pod \"marketplace-operator-79b997595-d78cz\" (UID: \"bb432ba7-089d-40a7-a0a7-43b3217a2527\") " pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.325503 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7wxk\" (UniqueName: \"kubernetes.io/projected/d09e8523-afbc-4f5d-888d-92b350c15f7c-kube-api-access-m7wxk\") pod \"control-plane-machine-set-operator-78cbb6b69f-lkldd\" (UID: \"d09e8523-afbc-4f5d-888d-92b350c15f7c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkldd" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.346468 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78rsj\" (UniqueName: \"kubernetes.io/projected/0f7e7f85-336c-4706-ad26-f218ca626bed-kube-api-access-78rsj\") pod \"packageserver-d55dfcdfc-jbtzt\" (UID: \"0f7e7f85-336c-4706-ad26-f218ca626bed\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.366717 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/375981dd-1af1-4165-9ba8-ef13b77a7477-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gtc9j\" (UID: \"375981dd-1af1-4165-9ba8-ef13b77a7477\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.368632 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkldd" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.374678 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4crxp" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.383162 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.389558 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-xgvpt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.404593 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4c72\" (UniqueName: \"kubernetes.io/projected/3c16d39c-7067-4368-b4f6-324d9612c4de-kube-api-access-w4c72\") pod \"service-ca-operator-777779d784-vng4p\" (UID: \"3c16d39c-7067-4368-b4f6-324d9612c4de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.423875 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzhwk\" (UniqueName: \"kubernetes.io/projected/2c097e98-163f-4ede-90a4-c9aa7318aaa0-kube-api-access-jzhwk\") pod \"olm-operator-6b444d44fb-qhj8g\" (UID: \"2c097e98-163f-4ede-90a4-c9aa7318aaa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.426648 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.442011 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjxlm\" (UniqueName: \"kubernetes.io/projected/44b73f4c-9f6e-4957-9ab5-2fddcc30dc99-kube-api-access-qjxlm\") pod \"etcd-operator-b45778765-sjkbq\" (UID: \"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.449228 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp895\" (UniqueName: \"kubernetes.io/projected/1fa08c06-83f8-4256-b721-af9b30f9f915-kube-api-access-qp895\") pod \"machine-config-controller-84d6567774-wpssm\" (UID: \"1fa08c06-83f8-4256-b721-af9b30f9f915\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.458391 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.466992 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pg8r8" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.475432 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.497405 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.506966 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56bx9\" (UniqueName: \"kubernetes.io/projected/3f005aad-20eb-493e-8b74-fb1cf25030aa-kube-api-access-56bx9\") pod \"migrator-59844c95c7-dp6lk\" (UID: \"3f005aad-20eb-493e-8b74-fb1cf25030aa\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dp6lk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.517123 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.523575 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhmq6\" (UniqueName: \"kubernetes.io/projected/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-kube-api-access-vhmq6\") pod \"collect-profiles-29325750-kkskc\" (UID: \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.531324 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533289 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d80374aa-c786-4e00-bb87-ac0998a61bc0-trusted-ca\") pod \"ingress-operator-5b745b69d9-gs4qw\" (UID: \"d80374aa-c786-4e00-bb87-ac0998a61bc0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533314 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p57bz\" (UniqueName: \"kubernetes.io/projected/bafaf93c-8a25-4b7e-8791-7d10b46ee161-kube-api-access-p57bz\") pod \"catalog-operator-68c6474976-t4p6t\" (UID: \"bafaf93c-8a25-4b7e-8791-7d10b46ee161\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533332 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xq9j\" (UniqueName: \"kubernetes.io/projected/d80374aa-c786-4e00-bb87-ac0998a61bc0-kube-api-access-8xq9j\") pod \"ingress-operator-5b745b69d9-gs4qw\" (UID: \"d80374aa-c786-4e00-bb87-ac0998a61bc0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533348 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d80374aa-c786-4e00-bb87-ac0998a61bc0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gs4qw\" (UID: \"d80374aa-c786-4e00-bb87-ac0998a61bc0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533384 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c1430fe6-c0d6-4356-8aa3-b3c06f738c2f-default-certificate\") pod \"router-default-5444994796-khlb2\" (UID: \"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f\") " pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533400 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bafaf93c-8a25-4b7e-8791-7d10b46ee161-srv-cert\") pod \"catalog-operator-68c6474976-t4p6t\" (UID: \"bafaf93c-8a25-4b7e-8791-7d10b46ee161\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533417 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d80374aa-c786-4e00-bb87-ac0998a61bc0-metrics-tls\") pod \"ingress-operator-5b745b69d9-gs4qw\" (UID: \"d80374aa-c786-4e00-bb87-ac0998a61bc0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533432 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5ndh\" (UniqueName: \"kubernetes.io/projected/1d532dfb-e972-4fcb-a141-0c27f317505f-kube-api-access-s5ndh\") pod \"machine-config-operator-74547568cd-zrngt\" (UID: \"1d532dfb-e972-4fcb-a141-0c27f317505f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533710 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1430fe6-c0d6-4356-8aa3-b3c06f738c2f-metrics-certs\") pod \"router-default-5444994796-khlb2\" (UID: 
\"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f\") " pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533735 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krsvz\" (UniqueName: \"kubernetes.io/projected/c1430fe6-c0d6-4356-8aa3-b3c06f738c2f-kube-api-access-krsvz\") pod \"router-default-5444994796-khlb2\" (UID: \"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f\") " pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533801 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d532dfb-e972-4fcb-a141-0c27f317505f-proxy-tls\") pod \"machine-config-operator-74547568cd-zrngt\" (UID: \"1d532dfb-e972-4fcb-a141-0c27f317505f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533845 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bafaf93c-8a25-4b7e-8791-7d10b46ee161-profile-collector-cert\") pod \"catalog-operator-68c6474976-t4p6t\" (UID: \"bafaf93c-8a25-4b7e-8791-7d10b46ee161\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533870 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c1430fe6-c0d6-4356-8aa3-b3c06f738c2f-stats-auth\") pod \"router-default-5444994796-khlb2\" (UID: \"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f\") " pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533890 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3756ce85-62bf-4b3b-94fc-ec155c12c913-cert\") pod \"ingress-canary-wrktt\" (UID: \"3756ce85-62bf-4b3b-94fc-ec155c12c913\") " pod="openshift-ingress-canary/ingress-canary-wrktt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533926 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15ca5de7-b5aa-4d86-82b5-122f22b494ee-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533945 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1d532dfb-e972-4fcb-a141-0c27f317505f-images\") pod \"machine-config-operator-74547568cd-zrngt\" (UID: \"1d532dfb-e972-4fcb-a141-0c27f317505f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.533968 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1d532dfb-e972-4fcb-a141-0c27f317505f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zrngt\" (UID: \"1d532dfb-e972-4fcb-a141-0c27f317505f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.534024 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15ca5de7-b5aa-4d86-82b5-122f22b494ee-trusted-ca\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.534043 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssvxr\" (UniqueName: \"kubernetes.io/projected/3756ce85-62bf-4b3b-94fc-ec155c12c913-kube-api-access-ssvxr\") pod \"ingress-canary-wrktt\" (UID: \"3756ce85-62bf-4b3b-94fc-ec155c12c913\") " pod="openshift-ingress-canary/ingress-canary-wrktt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.534070 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-bound-sa-token\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.534085 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1430fe6-c0d6-4356-8aa3-b3c06f738c2f-service-ca-bundle\") pod \"router-default-5444994796-khlb2\" (UID: \"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f\") " pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.534114 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5vh7\" (UniqueName: \"kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-kube-api-access-d5vh7\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.534133 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/15ca5de7-b5aa-4d86-82b5-122f22b494ee-registry-certificates\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.534167 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.534186 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-registry-tls\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.534202 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15ca5de7-b5aa-4d86-82b5-122f22b494ee-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: E1004 02:42:42.536293 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:43.036266339 +0000 UTC m=+142.933224977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.538160 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.540630 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft"] Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.566080 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm"] Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.571552 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn"] Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.576036 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-cznct"] Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.579912 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.635151 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2zdrf"] Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.635274 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-br64d"] Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.636758 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.636980 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9e3ab37-3e49-432d-9c9e-de56eafc9591-config-volume\") pod \"dns-default-flq6t\" (UID: \"a9e3ab37-3e49-432d-9c9e-de56eafc9591\") " pod="openshift-dns/dns-default-flq6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637037 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8e01310a-918b-4577-90cd-3e85c149008f-certs\") pod \"machine-config-server-qtbhx\" (UID: \"8e01310a-918b-4577-90cd-3e85c149008f\") " pod="openshift-machine-config-operator/machine-config-server-qtbhx" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637095 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15ca5de7-b5aa-4d86-82b5-122f22b494ee-trusted-ca\") pod \"image-registry-697d97f7c8-wppjk\" (UID: 
\"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637119 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssvxr\" (UniqueName: \"kubernetes.io/projected/3756ce85-62bf-4b3b-94fc-ec155c12c913-kube-api-access-ssvxr\") pod \"ingress-canary-wrktt\" (UID: \"3756ce85-62bf-4b3b-94fc-ec155c12c913\") " pod="openshift-ingress-canary/ingress-canary-wrktt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637140 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9bg7\" (UniqueName: \"kubernetes.io/projected/8e01310a-918b-4577-90cd-3e85c149008f-kube-api-access-s9bg7\") pod \"machine-config-server-qtbhx\" (UID: \"8e01310a-918b-4577-90cd-3e85c149008f\") " pod="openshift-machine-config-operator/machine-config-server-qtbhx" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637194 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-bound-sa-token\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637214 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1430fe6-c0d6-4356-8aa3-b3c06f738c2f-service-ca-bundle\") pod \"router-default-5444994796-khlb2\" (UID: \"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f\") " pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637243 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/31a61679-e5a4-444f-a77d-bd158e7a1dce-plugins-dir\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637303 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/31a61679-e5a4-444f-a77d-bd158e7a1dce-csi-data-dir\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637383 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5vh7\" (UniqueName: \"kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-kube-api-access-d5vh7\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637494 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15ca5de7-b5aa-4d86-82b5-122f22b494ee-registry-certificates\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637511 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9e3ab37-3e49-432d-9c9e-de56eafc9591-metrics-tls\") pod \"dns-default-flq6t\" (UID: \"a9e3ab37-3e49-432d-9c9e-de56eafc9591\") " pod="openshift-dns/dns-default-flq6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637537 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-registry-tls\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637561 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15ca5de7-b5aa-4d86-82b5-122f22b494ee-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637588 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d80374aa-c786-4e00-bb87-ac0998a61bc0-trusted-ca\") pod \"ingress-operator-5b745b69d9-gs4qw\" (UID: \"d80374aa-c786-4e00-bb87-ac0998a61bc0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637630 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/31a61679-e5a4-444f-a77d-bd158e7a1dce-mountpoint-dir\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637655 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p57bz\" (UniqueName: \"kubernetes.io/projected/bafaf93c-8a25-4b7e-8791-7d10b46ee161-kube-api-access-p57bz\") pod \"catalog-operator-68c6474976-t4p6t\" (UID: \"bafaf93c-8a25-4b7e-8791-7d10b46ee161\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 
02:42:42.637669 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/31a61679-e5a4-444f-a77d-bd158e7a1dce-socket-dir\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637687 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xq9j\" (UniqueName: \"kubernetes.io/projected/d80374aa-c786-4e00-bb87-ac0998a61bc0-kube-api-access-8xq9j\") pod \"ingress-operator-5b745b69d9-gs4qw\" (UID: \"d80374aa-c786-4e00-bb87-ac0998a61bc0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637719 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d80374aa-c786-4e00-bb87-ac0998a61bc0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gs4qw\" (UID: \"d80374aa-c786-4e00-bb87-ac0998a61bc0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637762 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c1430fe6-c0d6-4356-8aa3-b3c06f738c2f-default-certificate\") pod \"router-default-5444994796-khlb2\" (UID: \"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f\") " pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637776 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bafaf93c-8a25-4b7e-8791-7d10b46ee161-srv-cert\") pod \"catalog-operator-68c6474976-t4p6t\" (UID: \"bafaf93c-8a25-4b7e-8791-7d10b46ee161\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.637792 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8e01310a-918b-4577-90cd-3e85c149008f-node-bootstrap-token\") pod \"machine-config-server-qtbhx\" (UID: \"8e01310a-918b-4577-90cd-3e85c149008f\") " pod="openshift-machine-config-operator/machine-config-server-qtbhx" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.638041 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d80374aa-c786-4e00-bb87-ac0998a61bc0-metrics-tls\") pod \"ingress-operator-5b745b69d9-gs4qw\" (UID: \"d80374aa-c786-4e00-bb87-ac0998a61bc0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.638119 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5ndh\" (UniqueName: \"kubernetes.io/projected/1d532dfb-e972-4fcb-a141-0c27f317505f-kube-api-access-s5ndh\") pod \"machine-config-operator-74547568cd-zrngt\" (UID: \"1d532dfb-e972-4fcb-a141-0c27f317505f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.638138 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4pm4\" (UniqueName: \"kubernetes.io/projected/a9e3ab37-3e49-432d-9c9e-de56eafc9591-kube-api-access-h4pm4\") pod \"dns-default-flq6t\" (UID: \"a9e3ab37-3e49-432d-9c9e-de56eafc9591\") " pod="openshift-dns/dns-default-flq6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.638218 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c1430fe6-c0d6-4356-8aa3-b3c06f738c2f-metrics-certs\") pod \"router-default-5444994796-khlb2\" (UID: \"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f\") " pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.638253 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krsvz\" (UniqueName: \"kubernetes.io/projected/c1430fe6-c0d6-4356-8aa3-b3c06f738c2f-kube-api-access-krsvz\") pod \"router-default-5444994796-khlb2\" (UID: \"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f\") " pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.638289 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d532dfb-e972-4fcb-a141-0c27f317505f-proxy-tls\") pod \"machine-config-operator-74547568cd-zrngt\" (UID: \"1d532dfb-e972-4fcb-a141-0c27f317505f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.638393 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bafaf93c-8a25-4b7e-8791-7d10b46ee161-profile-collector-cert\") pod \"catalog-operator-68c6474976-t4p6t\" (UID: \"bafaf93c-8a25-4b7e-8791-7d10b46ee161\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.638409 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3756ce85-62bf-4b3b-94fc-ec155c12c913-cert\") pod \"ingress-canary-wrktt\" (UID: \"3756ce85-62bf-4b3b-94fc-ec155c12c913\") " pod="openshift-ingress-canary/ingress-canary-wrktt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.638429 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/31a61679-e5a4-444f-a77d-bd158e7a1dce-registration-dir\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.638467 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c1430fe6-c0d6-4356-8aa3-b3c06f738c2f-stats-auth\") pod \"router-default-5444994796-khlb2\" (UID: \"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f\") " pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.638483 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1d532dfb-e972-4fcb-a141-0c27f317505f-images\") pod \"machine-config-operator-74547568cd-zrngt\" (UID: \"1d532dfb-e972-4fcb-a141-0c27f317505f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.638517 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg8f6\" (UniqueName: \"kubernetes.io/projected/31a61679-e5a4-444f-a77d-bd158e7a1dce-kube-api-access-qg8f6\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.638553 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15ca5de7-b5aa-4d86-82b5-122f22b494ee-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 
02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.638569 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1d532dfb-e972-4fcb-a141-0c27f317505f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zrngt\" (UID: \"1d532dfb-e972-4fcb-a141-0c27f317505f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" Oct 04 02:42:42 crc kubenswrapper[4964]: E1004 02:42:42.639024 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:43.139009018 +0000 UTC m=+143.035967656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.653356 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15ca5de7-b5aa-4d86-82b5-122f22b494ee-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.654272 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1430fe6-c0d6-4356-8aa3-b3c06f738c2f-service-ca-bundle\") pod \"router-default-5444994796-khlb2\" (UID: 
\"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f\") " pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.655486 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15ca5de7-b5aa-4d86-82b5-122f22b494ee-registry-certificates\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.658859 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15ca5de7-b5aa-4d86-82b5-122f22b494ee-trusted-ca\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.659666 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d80374aa-c786-4e00-bb87-ac0998a61bc0-trusted-ca\") pod \"ingress-operator-5b745b69d9-gs4qw\" (UID: \"d80374aa-c786-4e00-bb87-ac0998a61bc0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.665449 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d532dfb-e972-4fcb-a141-0c27f317505f-proxy-tls\") pod \"machine-config-operator-74547568cd-zrngt\" (UID: \"1d532dfb-e972-4fcb-a141-0c27f317505f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.665725 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3756ce85-62bf-4b3b-94fc-ec155c12c913-cert\") pod \"ingress-canary-wrktt\" (UID: 
\"3756ce85-62bf-4b3b-94fc-ec155c12c913\") " pod="openshift-ingress-canary/ingress-canary-wrktt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.670600 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.670824 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-registry-tls\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.671287 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bafaf93c-8a25-4b7e-8791-7d10b46ee161-profile-collector-cert\") pod \"catalog-operator-68c6474976-t4p6t\" (UID: \"bafaf93c-8a25-4b7e-8791-7d10b46ee161\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.672266 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1d532dfb-e972-4fcb-a141-0c27f317505f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zrngt\" (UID: \"1d532dfb-e972-4fcb-a141-0c27f317505f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.672379 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1d532dfb-e972-4fcb-a141-0c27f317505f-images\") pod \"machine-config-operator-74547568cd-zrngt\" (UID: \"1d532dfb-e972-4fcb-a141-0c27f317505f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" Oct 04 
02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.673241 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c1430fe6-c0d6-4356-8aa3-b3c06f738c2f-stats-auth\") pod \"router-default-5444994796-khlb2\" (UID: \"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f\") " pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.674214 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs"] Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.679384 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c1430fe6-c0d6-4356-8aa3-b3c06f738c2f-default-certificate\") pod \"router-default-5444994796-khlb2\" (UID: \"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f\") " pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.679585 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d80374aa-c786-4e00-bb87-ac0998a61bc0-metrics-tls\") pod \"ingress-operator-5b745b69d9-gs4qw\" (UID: \"d80374aa-c786-4e00-bb87-ac0998a61bc0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.681788 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bafaf93c-8a25-4b7e-8791-7d10b46ee161-srv-cert\") pod \"catalog-operator-68c6474976-t4p6t\" (UID: \"bafaf93c-8a25-4b7e-8791-7d10b46ee161\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.682827 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/15ca5de7-b5aa-4d86-82b5-122f22b494ee-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.684395 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1430fe6-c0d6-4356-8aa3-b3c06f738c2f-metrics-certs\") pod \"router-default-5444994796-khlb2\" (UID: \"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f\") " pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.693821 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssvxr\" (UniqueName: \"kubernetes.io/projected/3756ce85-62bf-4b3b-94fc-ec155c12c913-kube-api-access-ssvxr\") pod \"ingress-canary-wrktt\" (UID: \"3756ce85-62bf-4b3b-94fc-ec155c12c913\") " pod="openshift-ingress-canary/ingress-canary-wrktt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.702996 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.705647 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p57bz\" (UniqueName: \"kubernetes.io/projected/bafaf93c-8a25-4b7e-8791-7d10b46ee161-kube-api-access-p57bz\") pod \"catalog-operator-68c6474976-t4p6t\" (UID: \"bafaf93c-8a25-4b7e-8791-7d10b46ee161\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.706816 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wrktt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.718876 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" event={"ID":"8d10fd86-53ea-445e-bdf2-83e8caf22d02","Type":"ContainerStarted","Data":"e226dfbec58719c1793cb7729ee5269e904e179bf623c2cb2c66cc8a479eda9f"} Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.730819 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" event={"ID":"38588578-4883-453a-8a92-1bef1dc0f479","Type":"ContainerStarted","Data":"f9bd40a5a838d114049ab5fb2cb70bbe27cb61c472d5a986ab017c87aad7e374"} Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.731210 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.739902 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9e3ab37-3e49-432d-9c9e-de56eafc9591-config-volume\") pod \"dns-default-flq6t\" (UID: \"a9e3ab37-3e49-432d-9c9e-de56eafc9591\") " pod="openshift-dns/dns-default-flq6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.739952 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8e01310a-918b-4577-90cd-3e85c149008f-certs\") pod \"machine-config-server-qtbhx\" (UID: \"8e01310a-918b-4577-90cd-3e85c149008f\") " pod="openshift-machine-config-operator/machine-config-server-qtbhx" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.739986 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9bg7\" (UniqueName: \"kubernetes.io/projected/8e01310a-918b-4577-90cd-3e85c149008f-kube-api-access-s9bg7\") 
pod \"machine-config-server-qtbhx\" (UID: \"8e01310a-918b-4577-90cd-3e85c149008f\") " pod="openshift-machine-config-operator/machine-config-server-qtbhx" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.740019 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/31a61679-e5a4-444f-a77d-bd158e7a1dce-plugins-dir\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.740039 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/31a61679-e5a4-444f-a77d-bd158e7a1dce-csi-data-dir\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.740071 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.740093 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9e3ab37-3e49-432d-9c9e-de56eafc9591-metrics-tls\") pod \"dns-default-flq6t\" (UID: \"a9e3ab37-3e49-432d-9c9e-de56eafc9591\") " pod="openshift-dns/dns-default-flq6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.740117 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/31a61679-e5a4-444f-a77d-bd158e7a1dce-mountpoint-dir\") 
pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.740142 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/31a61679-e5a4-444f-a77d-bd158e7a1dce-socket-dir\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.740184 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8e01310a-918b-4577-90cd-3e85c149008f-node-bootstrap-token\") pod \"machine-config-server-qtbhx\" (UID: \"8e01310a-918b-4577-90cd-3e85c149008f\") " pod="openshift-machine-config-operator/machine-config-server-qtbhx" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.740206 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4pm4\" (UniqueName: \"kubernetes.io/projected/a9e3ab37-3e49-432d-9c9e-de56eafc9591-kube-api-access-h4pm4\") pod \"dns-default-flq6t\" (UID: \"a9e3ab37-3e49-432d-9c9e-de56eafc9591\") " pod="openshift-dns/dns-default-flq6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.740264 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/31a61679-e5a4-444f-a77d-bd158e7a1dce-registration-dir\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.740283 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg8f6\" (UniqueName: \"kubernetes.io/projected/31a61679-e5a4-444f-a77d-bd158e7a1dce-kube-api-access-qg8f6\") pod 
\"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.741819 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9e3ab37-3e49-432d-9c9e-de56eafc9591-config-volume\") pod \"dns-default-flq6t\" (UID: \"a9e3ab37-3e49-432d-9c9e-de56eafc9591\") " pod="openshift-dns/dns-default-flq6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.741899 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/31a61679-e5a4-444f-a77d-bd158e7a1dce-mountpoint-dir\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.742160 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/31a61679-e5a4-444f-a77d-bd158e7a1dce-socket-dir\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.744031 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/31a61679-e5a4-444f-a77d-bd158e7a1dce-plugins-dir\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.744497 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/31a61679-e5a4-444f-a77d-bd158e7a1dce-registration-dir\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " 
pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.745208 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9e3ab37-3e49-432d-9c9e-de56eafc9591-metrics-tls\") pod \"dns-default-flq6t\" (UID: \"a9e3ab37-3e49-432d-9c9e-de56eafc9591\") " pod="openshift-dns/dns-default-flq6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.748034 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8e01310a-918b-4577-90cd-3e85c149008f-node-bootstrap-token\") pod \"machine-config-server-qtbhx\" (UID: \"8e01310a-918b-4577-90cd-3e85c149008f\") " pod="openshift-machine-config-operator/machine-config-server-qtbhx" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.749390 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xq9j\" (UniqueName: \"kubernetes.io/projected/d80374aa-c786-4e00-bb87-ac0998a61bc0-kube-api-access-8xq9j\") pod \"ingress-operator-5b745b69d9-gs4qw\" (UID: \"d80374aa-c786-4e00-bb87-ac0998a61bc0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.749690 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5vh7\" (UniqueName: \"kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-kube-api-access-d5vh7\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.749773 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8e01310a-918b-4577-90cd-3e85c149008f-certs\") pod \"machine-config-server-qtbhx\" (UID: \"8e01310a-918b-4577-90cd-3e85c149008f\") " 
pod="openshift-machine-config-operator/machine-config-server-qtbhx" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.776448 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/31a61679-e5a4-444f-a77d-bd158e7a1dce-csi-data-dir\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: E1004 02:42:42.776504 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:43.276488108 +0000 UTC m=+143.173446746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.803216 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dp6lk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.811106 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.813008 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh"] Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.813997 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-j54j2"] Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.815652 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5ndh\" (UniqueName: \"kubernetes.io/projected/1d532dfb-e972-4fcb-a141-0c27f317505f-kube-api-access-s5ndh\") pod \"machine-config-operator-74547568cd-zrngt\" (UID: \"1d532dfb-e972-4fcb-a141-0c27f317505f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.815830 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt"] Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.817119 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh"] Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.820212 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" event={"ID":"cc0ac95b-a7a9-4b23-a073-99146acc645d","Type":"ContainerStarted","Data":"249210a00f50f0f6cf24d523c5eb5addde410f7f2652b8a169de8abd4a9c215d"} Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.821891 4964 generic.go:334] "Generic (PLEG): container finished" podID="8c9da68e-d45d-44c9-be51-0b8a38042692" containerID="6d88e313a3d5a266356840e11b6813d14edc88e4073a46dbbce3136f823df7c7" exitCode=0 Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.821967 4964 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" event={"ID":"8c9da68e-d45d-44c9-be51-0b8a38042692","Type":"ContainerDied","Data":"6d88e313a3d5a266356840e11b6813d14edc88e4073a46dbbce3136f823df7c7"} Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.823339 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft" event={"ID":"5a18b874-8ef9-45af-9450-b33fd3c51efd","Type":"ContainerStarted","Data":"3f5124a4bd94ba6d3d00a041cf79cb548d823ceae4d27d44e2f5baa8f1e68903"} Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.823540 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.826530 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-bound-sa-token\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.828023 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krsvz\" (UniqueName: \"kubernetes.io/projected/c1430fe6-c0d6-4356-8aa3-b3c06f738c2f-kube-api-access-krsvz\") pod \"router-default-5444994796-khlb2\" (UID: \"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f\") " pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.829754 4964 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-nr92k container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 04 02:42:42 crc kubenswrapper[4964]: 
I1004 02:42:42.829794 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" podUID="da04224f-997b-4890-b0c8-2bf983b1f21d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.834312 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d80374aa-c786-4e00-bb87-ac0998a61bc0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gs4qw\" (UID: \"d80374aa-c786-4e00-bb87-ac0998a61bc0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.841875 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:42 crc kubenswrapper[4964]: E1004 02:42:42.842032 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:43.342009934 +0000 UTC m=+143.238968572 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.842253 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:42 crc kubenswrapper[4964]: E1004 02:42:42.842901 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:43.342880643 +0000 UTC m=+143.239839281 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.869530 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4pm4\" (UniqueName: \"kubernetes.io/projected/a9e3ab37-3e49-432d-9c9e-de56eafc9591-kube-api-access-h4pm4\") pod \"dns-default-flq6t\" (UID: \"a9e3ab37-3e49-432d-9c9e-de56eafc9591\") " pod="openshift-dns/dns-default-flq6t" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.903498 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9bg7\" (UniqueName: \"kubernetes.io/projected/8e01310a-918b-4577-90cd-3e85c149008f-kube-api-access-s9bg7\") pod \"machine-config-server-qtbhx\" (UID: \"8e01310a-918b-4577-90cd-3e85c149008f\") " pod="openshift-machine-config-operator/machine-config-server-qtbhx" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.918361 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg8f6\" (UniqueName: \"kubernetes.io/projected/31a61679-e5a4-444f-a77d-bd158e7a1dce-kube-api-access-qg8f6\") pod \"csi-hostpathplugin-rlnbj\" (UID: \"31a61679-e5a4-444f-a77d-bd158e7a1dce\") " pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.942964 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:42 crc kubenswrapper[4964]: E1004 02:42:42.943365 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:43.443331698 +0000 UTC m=+143.340290336 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:42 crc kubenswrapper[4964]: I1004 02:42:42.991952 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.017079 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qtbhx" Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.040063 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.044643 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.045658 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:43 crc kubenswrapper[4964]: E1004 02:42:43.045991 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:43.545979023 +0000 UTC m=+143.442937651 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.048399 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-flq6t" Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.051270 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" Oct 04 02:42:43 crc kubenswrapper[4964]: W1004 02:42:43.052144 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7b419e3_b339_4a82_8cb1_c14467712c1f.slice/crio-5ca451b1c2e525e4d9fa4481655f762f04b3a6334e4e70a8fce9a60fd3f5e550 WatchSource:0}: Error finding container 5ca451b1c2e525e4d9fa4481655f762f04b3a6334e4e70a8fce9a60fd3f5e550: Status 404 returned error can't find the container with id 5ca451b1c2e525e4d9fa4481655f762f04b3a6334e4e70a8fce9a60fd3f5e550 Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.148430 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:43 crc kubenswrapper[4964]: E1004 02:42:43.149481 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:43.649452605 +0000 UTC m=+143.546411253 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.155585 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:43 crc kubenswrapper[4964]: E1004 02:42:43.156930 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:43.656911356 +0000 UTC m=+143.553869994 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.263429 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:43 crc kubenswrapper[4964]: E1004 02:42:43.264342 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:43.764321395 +0000 UTC m=+143.661280043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.267899 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkldd"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.284711 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pg8r8"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.288154 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.288180 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.365416 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:43 crc kubenswrapper[4964]: E1004 02:42:43.365777 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-04 02:42:43.865762112 +0000 UTC m=+143.762720750 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.466659 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:43 crc kubenswrapper[4964]: E1004 02:42:43.467502 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:43.967452817 +0000 UTC m=+143.864411455 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.467660 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:43 crc kubenswrapper[4964]: E1004 02:42:43.468041 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:43.968027705 +0000 UTC m=+143.864986343 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.514217 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xgvpt"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.521877 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.525602 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4crxp"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.528183 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4zll7"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.538430 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.542966 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g"] Oct 04 02:42:43 crc kubenswrapper[4964]: W1004 02:42:43.566182 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e01310a_918b_4577_90cd_3e85c149008f.slice/crio-85c622bb72269c7cd84adfe4a60516e88b714bb4e72011cacba937cbfde8a034 WatchSource:0}: Error finding container 85c622bb72269c7cd84adfe4a60516e88b714bb4e72011cacba937cbfde8a034: 
Status 404 returned error can't find the container with id 85c622bb72269c7cd84adfe4a60516e88b714bb4e72011cacba937cbfde8a034 Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.569113 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:43 crc kubenswrapper[4964]: E1004 02:42:43.570579 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:44.070556826 +0000 UTC m=+143.967515464 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.604306 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" podStartSLOduration=122.604289906 podStartE2EDuration="2m2.604289906s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:43.602387665 +0000 UTC m=+143.499346303" watchObservedRunningTime="2025-10-04 02:42:43.604289906 +0000 UTC m=+143.501248534" Oct 04 02:42:43 crc 
kubenswrapper[4964]: W1004 02:42:43.656191 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf377b1b3_e592_418c_af18_6cd9c169d9c5.slice/crio-fd9b9d03d831be7c9e1085bb744d2c51517f6b7c823bb932155cbd62639a733c WatchSource:0}: Error finding container fd9b9d03d831be7c9e1085bb744d2c51517f6b7c823bb932155cbd62639a733c: Status 404 returned error can't find the container with id fd9b9d03d831be7c9e1085bb744d2c51517f6b7c823bb932155cbd62639a733c Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.656262 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.656296 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vng4p"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.658510 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.664233 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.668737 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.671368 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:43 crc kubenswrapper[4964]: 
E1004 02:42:43.671800 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:44.171781737 +0000 UTC m=+144.068740375 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.686745 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wrktt"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.698270 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d78cz"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.712555 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sjkbq"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.735831 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt"] Oct 04 02:42:43 crc kubenswrapper[4964]: W1004 02:42:43.751448 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd80374aa_c786_4e00_bb87_ac0998a61bc0.slice/crio-e062cbb23b6100ce8d6a4a95392dac8a96a4958da862aa1515ced77e93490c77 WatchSource:0}: Error finding container e062cbb23b6100ce8d6a4a95392dac8a96a4958da862aa1515ced77e93490c77: Status 404 returned error can't find the container with id 
e062cbb23b6100ce8d6a4a95392dac8a96a4958da862aa1515ced77e93490c77 Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.775665 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:43 crc kubenswrapper[4964]: E1004 02:42:43.775977 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:44.275962632 +0000 UTC m=+144.172921270 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:43 crc kubenswrapper[4964]: W1004 02:42:43.780417 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44b73f4c_9f6e_4957_9ab5_2fddcc30dc99.slice/crio-079254ca25724819065f9dfe20f51d1c728bfffb688a11e84492bf2b2f9aa3cd WatchSource:0}: Error finding container 079254ca25724819065f9dfe20f51d1c728bfffb688a11e84492bf2b2f9aa3cd: Status 404 returned error can't find the container with id 079254ca25724819065f9dfe20f51d1c728bfffb688a11e84492bf2b2f9aa3cd Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.795473 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dp6lk"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.797802 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.805829 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rlnbj"] Oct 04 02:42:43 crc kubenswrapper[4964]: W1004 02:42:43.813850 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3756ce85_62bf_4b3b_94fc_ec155c12c913.slice/crio-b2a18adfd08191e2b69f5960320f2ae41c77bf9cdab22ccc0baf09f063d59ff3 WatchSource:0}: Error finding container b2a18adfd08191e2b69f5960320f2ae41c77bf9cdab22ccc0baf09f063d59ff3: Status 404 returned error can't find the container with id b2a18adfd08191e2b69f5960320f2ae41c77bf9cdab22ccc0baf09f063d59ff3 Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.829910 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" event={"ID":"2c097e98-163f-4ede-90a4-c9aa7318aaa0","Type":"ContainerStarted","Data":"0645e3c6e7e3631832ed010f73cf818479f9db37f09913ac62d3212d976396c1"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.831586 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pg8r8" event={"ID":"c5873894-8da0-464d-9a26-adad29928a59","Type":"ContainerStarted","Data":"fe3e65aa495362288455183d1a8e083ec9031be5874078cbaa38096cda0f0062"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.832736 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-j54j2" event={"ID":"f7198988-705d-437d-9d57-787fda1d80c7","Type":"ContainerStarted","Data":"006020bdadb611cbaa9d1bceed2874416263b92e0bcc91364fecc8db0ad33c25"} Oct 04 02:42:43 crc 
kubenswrapper[4964]: I1004 02:42:43.841349 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j" event={"ID":"375981dd-1af1-4165-9ba8-ef13b77a7477","Type":"ContainerStarted","Data":"5faea58a2dc1110700eeb4097ede0afd94edad0cd3fe820b911949a58948aee8"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.842360 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-khlb2" event={"ID":"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f","Type":"ContainerStarted","Data":"4d6edaa6301e4b29cb5b0e033c3dc383f2afc1c0e01ef77281e3a9fa160be241"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.845588 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4zll7" event={"ID":"f377b1b3-e592-418c-af18-6cd9c169d9c5","Type":"ContainerStarted","Data":"fd9b9d03d831be7c9e1085bb744d2c51517f6b7c823bb932155cbd62639a733c"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.848101 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft" event={"ID":"5a18b874-8ef9-45af-9450-b33fd3c51efd","Type":"ContainerStarted","Data":"bc7c70ce39e6d86594cf92bdf1ad179636de618c359cd3a8e22eb6082a00756c"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.853204 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" event={"ID":"8d10fd86-53ea-445e-bdf2-83e8caf22d02","Type":"ContainerStarted","Data":"e9c21596e047852258f8c173edc00bb86698117f8b56bdbb6b4b24e550cd13ad"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.855266 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" 
event={"ID":"bb432ba7-089d-40a7-a0a7-43b3217a2527","Type":"ContainerStarted","Data":"f4f9bfc3728c5f57a6557e04ae3ca020186103a5f7c8c54f61b2b433dcd98d7a"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.856975 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm" event={"ID":"1fa08c06-83f8-4256-b721-af9b30f9f915","Type":"ContainerStarted","Data":"741eed10dd1e688be01c3c372134b87faf1d55807adafe238fd90ef852c771f3"} Oct 04 02:42:43 crc kubenswrapper[4964]: W1004 02:42:43.858450 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25b0c75e_6790_4622_80d9_d1182608eb38.slice/crio-bfa5fe449732d3b685206d02edc97bacfaf59f2ac47dbd6d10c7f1acd1f6d8fa WatchSource:0}: Error finding container bfa5fe449732d3b685206d02edc97bacfaf59f2ac47dbd6d10c7f1acd1f6d8fa: Status 404 returned error can't find the container with id bfa5fe449732d3b685206d02edc97bacfaf59f2ac47dbd6d10c7f1acd1f6d8fa Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.859728 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" event={"ID":"b564efb8-50db-4d84-a456-4857001ab84a","Type":"ContainerStarted","Data":"73bba6d1ee40bfa1ae61dfcdefd3905505ef0751c21fbff26be98b49f585eebb"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.860838 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr" event={"ID":"6018b91d-752d-4b19-9121-705d34195d35","Type":"ContainerStarted","Data":"ba1401d77ff2c158e85eb2e84cce2aeaf864b729b3cd362a80c5d6ddd723cbe0"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.861594 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" 
event={"ID":"0f7e7f85-336c-4706-ad26-f218ca626bed","Type":"ContainerStarted","Data":"e39a89a53c4edfb82b178f6886d7cc61de878daf8a0ea787a4b789dfb12e0521"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.869576 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-br64d" event={"ID":"e624ca3f-538e-443a-8f2e-aa64988c9ce4","Type":"ContainerStarted","Data":"7dd8a3f7079e69a80895bc3162795cab3164694cfe07c992f1d5e5b312b3de05"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.870847 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkldd" event={"ID":"d09e8523-afbc-4f5d-888d-92b350c15f7c","Type":"ContainerStarted","Data":"62c64b96b530985e28fe4893503d78a98ab1af28ef3095fcd6ac74f6056f1671"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.872313 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qtbhx" event={"ID":"8e01310a-918b-4577-90cd-3e85c149008f","Type":"ContainerStarted","Data":"85c622bb72269c7cd84adfe4a60516e88b714bb4e72011cacba937cbfde8a034"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.874201 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" event={"ID":"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99","Type":"ContainerStarted","Data":"079254ca25724819065f9dfe20f51d1c728bfffb688a11e84492bf2b2f9aa3cd"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.876328 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" event={"ID":"108d1f0d-1354-4b1b-a3c3-fa5b14cee77f","Type":"ContainerStarted","Data":"ac631f9c914d85acc4981959c1749695049d7eaed8514fe6b7c589c08e607e6d"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.879050 4964 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:43 crc kubenswrapper[4964]: E1004 02:42:43.881474 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:44.381461499 +0000 UTC m=+144.278420127 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.886412 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-flq6t"] Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.889408 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" event={"ID":"38588578-4883-453a-8a92-1bef1dc0f479","Type":"ContainerStarted","Data":"5765dfb1383e09bdd4f9b5bdb1e55ddf24f555acb13fcae7680404904e4b8ee6"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.890788 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" event={"ID":"d80374aa-c786-4e00-bb87-ac0998a61bc0","Type":"ContainerStarted","Data":"e062cbb23b6100ce8d6a4a95392dac8a96a4958da862aa1515ced77e93490c77"} Oct 04 02:42:43 
crc kubenswrapper[4964]: I1004 02:42:43.897314 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" event={"ID":"77f14c97-6ee6-4ff8-b98d-3c8cd595b994","Type":"ContainerStarted","Data":"b56918fe5c57378f118da363c3fecf1b2764b53e90b4c4a691a3a6ebcaa11043"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.903286 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm" event={"ID":"9bc77816-30a0-4d2c-8817-67fa89c39d35","Type":"ContainerStarted","Data":"5e61e95009eeebe8aeb85f48a7ecd3e7303427f205f5c10613a133bbf4661f56"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.903325 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm" event={"ID":"9bc77816-30a0-4d2c-8817-67fa89c39d35","Type":"ContainerStarted","Data":"a22952e9394e6c1f873b884d1d514a2de4909316110ee68d2f67c00a74ce9c68"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.905600 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xgvpt" event={"ID":"2e961f5c-6d04-462b-9097-346efdfe347c","Type":"ContainerStarted","Data":"5876b9d8d0fc0eb7ab1dd938ed458b0be08f26adeb2abd02101ab873d1249c4a"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.909180 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" event={"ID":"070aadf4-bcde-4da7-bbac-6937b7e6937f","Type":"ContainerStarted","Data":"30439c7a4ae16f46b73809754b13f0d0e08ebc0ad07667b998305c4dfaa49b30"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.911944 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cznct" 
event={"ID":"d30aed64-b8f7-4028-8dfc-f3661ce1c459","Type":"ContainerStarted","Data":"7e118f204c2cbb7288943f024e0d8747d0887d595d7625c47a9b2e6a1c6bb755"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.912838 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4crxp" event={"ID":"7cba096c-8054-4ad9-bb23-185f18482afb","Type":"ContainerStarted","Data":"e88e2d4d8dee475db9a75e872a409f65d0a8f2f3fce11a0f5b03c093218187a3"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.915375 4964 generic.go:334] "Generic (PLEG): container finished" podID="c90c99ad-ac5c-4b40-8eba-e10fa92c1059" containerID="5af2aa068f33c65627fe2abdb6d02baf2ec89386bc765066432415b4d768a66b" exitCode=0 Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.915413 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" event={"ID":"c90c99ad-ac5c-4b40-8eba-e10fa92c1059","Type":"ContainerDied","Data":"5af2aa068f33c65627fe2abdb6d02baf2ec89386bc765066432415b4d768a66b"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.915429 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" event={"ID":"c90c99ad-ac5c-4b40-8eba-e10fa92c1059","Type":"ContainerStarted","Data":"51574973ac2e015dbca704da10b5b256855d00af423e629da15d80f24464cac7"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.916764 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" event={"ID":"c7b419e3-b339-4a82-8cb1-c14467712c1f","Type":"ContainerStarted","Data":"5ca451b1c2e525e4d9fa4481655f762f04b3a6334e4e70a8fce9a60fd3f5e550"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.921524 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt" 
event={"ID":"59e47053-7257-4a23-81d0-c4965cde15bf","Type":"ContainerStarted","Data":"badeee23e6b73acd2dae9b271d24b45473993d8eaf182f25854e5600e5f864f0"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.924059 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wrktt" event={"ID":"3756ce85-62bf-4b3b-94fc-ec155c12c913","Type":"ContainerStarted","Data":"b2a18adfd08191e2b69f5960320f2ae41c77bf9cdab22ccc0baf09f063d59ff3"} Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.925197 4964 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-nr92k container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.925236 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" podUID="da04224f-997b-4890-b0c8-2bf983b1f21d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 04 02:42:43 crc kubenswrapper[4964]: W1004 02:42:43.947581 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9e3ab37_3e49_432d_9c9e_de56eafc9591.slice/crio-7e4145f44499c230841fc2d8dfa0c3d6885dbbf296fc017fbb0a62ddc2841067 WatchSource:0}: Error finding container 7e4145f44499c230841fc2d8dfa0c3d6885dbbf296fc017fbb0a62ddc2841067: Status 404 returned error can't find the container with id 7e4145f44499c230841fc2d8dfa0c3d6885dbbf296fc017fbb0a62ddc2841067 Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.979745 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:43 crc kubenswrapper[4964]: E1004 02:42:43.979971 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:44.47991334 +0000 UTC m=+144.376871978 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:43 crc kubenswrapper[4964]: I1004 02:42:43.982325 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:43 crc kubenswrapper[4964]: E1004 02:42:43.983983 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:44.48394908 +0000 UTC m=+144.380907718 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.085679 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:44 crc kubenswrapper[4964]: E1004 02:42:44.086259 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:44.586240614 +0000 UTC m=+144.483199252 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.187456 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:44 crc kubenswrapper[4964]: E1004 02:42:44.187838 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:44.687822725 +0000 UTC m=+144.584781373 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.249041 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-brgcm" podStartSLOduration=123.249026182 podStartE2EDuration="2m3.249026182s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:44.203933756 +0000 UTC m=+144.100892404" watchObservedRunningTime="2025-10-04 02:42:44.249026182 +0000 UTC m=+144.145984820" Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.290812 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:44 crc kubenswrapper[4964]: E1004 02:42:44.290972 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:44.790932656 +0000 UTC m=+144.687891294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.291068 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:44 crc kubenswrapper[4964]: E1004 02:42:44.291408 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:44.791395971 +0000 UTC m=+144.688354609 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.392327 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:44 crc kubenswrapper[4964]: E1004 02:42:44.392989 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:44.892971701 +0000 UTC m=+144.789930339 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.494345 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:44 crc kubenswrapper[4964]: E1004 02:42:44.494928 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:44.994916354 +0000 UTC m=+144.891874992 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.524702 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gmxft" podStartSLOduration=123.524683576 podStartE2EDuration="2m3.524683576s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:44.524398506 +0000 UTC m=+144.421357144" watchObservedRunningTime="2025-10-04 02:42:44.524683576 +0000 UTC m=+144.421642214" Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.598454 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:44 crc kubenswrapper[4964]: E1004 02:42:44.598674 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:45.098640514 +0000 UTC m=+144.995599152 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.598835 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:44 crc kubenswrapper[4964]: E1004 02:42:44.599172 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:45.099157121 +0000 UTC m=+144.996115759 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.701167 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:44 crc kubenswrapper[4964]: E1004 02:42:44.701609 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:45.20159224 +0000 UTC m=+145.098550878 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.703409 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-b4x88" podStartSLOduration=123.703394508 podStartE2EDuration="2m3.703394508s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:44.6455682 +0000 UTC m=+144.542526838" watchObservedRunningTime="2025-10-04 02:42:44.703394508 +0000 UTC m=+144.600353146" Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.747116 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s9qmm" podStartSLOduration=123.747097989 podStartE2EDuration="2m3.747097989s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:44.705742434 +0000 UTC m=+144.602701072" watchObservedRunningTime="2025-10-04 02:42:44.747097989 +0000 UTC m=+144.644056627" Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.792251 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwzjn" podStartSLOduration=123.792204347 podStartE2EDuration="2m3.792204347s" podCreationTimestamp="2025-10-04 02:40:41 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:44.7848888 +0000 UTC m=+144.681847448" watchObservedRunningTime="2025-10-04 02:42:44.792204347 +0000 UTC m=+144.689162985" Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.804239 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:44 crc kubenswrapper[4964]: E1004 02:42:44.804804 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:45.304760072 +0000 UTC m=+145.201718710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.905194 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:44 crc kubenswrapper[4964]: E1004 02:42:44.905385 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:45.405357262 +0000 UTC m=+145.302315890 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.905438 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:44 crc kubenswrapper[4964]: E1004 02:42:44.905886 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:45.405871838 +0000 UTC m=+145.302830486 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.933285 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" event={"ID":"bafaf93c-8a25-4b7e-8791-7d10b46ee161","Type":"ContainerStarted","Data":"aff4f6632c378a17d0568eba40a8d4d83acb12dc4299e14322468f6b64b67584"} Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.933353 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" event={"ID":"bafaf93c-8a25-4b7e-8791-7d10b46ee161","Type":"ContainerStarted","Data":"6511f8c66c8cefea066fc58049930debb9377297599ad52561763392c8f0b7e7"} Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.934353 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.936436 4964 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-t4p6t container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.936477 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" podUID="bafaf93c-8a25-4b7e-8791-7d10b46ee161" containerName="catalog-operator" 
probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.938467 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" event={"ID":"77f14c97-6ee6-4ff8-b98d-3c8cd595b994","Type":"ContainerStarted","Data":"a06cc299f9052b11b553a1e032f51ab5b8b257d86e053b9d70fe6eec68471cdb"} Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.958314 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" event={"ID":"2c097e98-163f-4ede-90a4-c9aa7318aaa0","Type":"ContainerStarted","Data":"79813c530875c97f96b758ab55a3c71e9fcb27d3f6739fa0dd0db122984f586f"} Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.959140 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.960422 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-j54j2" event={"ID":"f7198988-705d-437d-9d57-787fda1d80c7","Type":"ContainerStarted","Data":"2e1b5a75ee5c75e6ec8f5d5108ec52d61bd9b348ab127c6d575ceed796eaf550"} Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.962936 4964 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qhj8g container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.962967 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" podUID="2c097e98-163f-4ede-90a4-c9aa7318aaa0" containerName="olm-operator" probeResult="failure" 
output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.968129 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" event={"ID":"bb432ba7-089d-40a7-a0a7-43b3217a2527","Type":"ContainerStarted","Data":"e11c958fdc814e66294c380959efa2654cea41e67a3578ae9d61180a13c7ce8c"} Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.968787 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.972429 4964 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-d78cz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.972474 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" podUID="bb432ba7-089d-40a7-a0a7-43b3217a2527" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 04 02:42:44 crc kubenswrapper[4964]: I1004 02:42:44.985076 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" event={"ID":"44b73f4c-9f6e-4957-9ab5-2fddcc30dc99","Type":"ContainerStarted","Data":"c2ea2c600a3f44fac9c2c98ac43cc85b248f503c146b1d72b1b48065b7e7c7c6"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:44.999986 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j" 
event={"ID":"375981dd-1af1-4165-9ba8-ef13b77a7477","Type":"ContainerStarted","Data":"9d429550cedd9b3626e7efb355e2bd9a0da1edda516c3431752cea0bea9c27fb"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.006187 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:45 crc kubenswrapper[4964]: E1004 02:42:45.007827 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:45.50780629 +0000 UTC m=+145.404764928 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.027399 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" podStartSLOduration=124.027382092 podStartE2EDuration="2m4.027382092s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.02603861 +0000 UTC m=+144.922997258" watchObservedRunningTime="2025-10-04 02:42:45.027382092 +0000 UTC m=+144.924340730" Oct 04 
02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.028263 4964 generic.go:334] "Generic (PLEG): container finished" podID="108d1f0d-1354-4b1b-a3c3-fa5b14cee77f" containerID="3275db9108f054d105232b0ab7ec585a836ba4fc150056290b5ca46c34600d1a" exitCode=0 Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.028369 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" event={"ID":"108d1f0d-1354-4b1b-a3c3-fa5b14cee77f","Type":"ContainerDied","Data":"3275db9108f054d105232b0ab7ec585a836ba4fc150056290b5ca46c34600d1a"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.056986 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" event={"ID":"070aadf4-bcde-4da7-bbac-6937b7e6937f","Type":"ContainerStarted","Data":"880930e6ad0a618e78069a94d02eb2c442cc63be23775798c12366aeb7adbeb6"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.067412 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" podStartSLOduration=124.067395596 podStartE2EDuration="2m4.067395596s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.065968229 +0000 UTC m=+144.962926867" watchObservedRunningTime="2025-10-04 02:42:45.067395596 +0000 UTC m=+144.964354234" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.091872 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm" event={"ID":"1fa08c06-83f8-4256-b721-af9b30f9f915","Type":"ContainerStarted","Data":"dc81ab2259eacc06d190486e96a501c80b6e23758863d3887f2007b1d85e88e5"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.100099 4964 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qtbhx" event={"ID":"8e01310a-918b-4577-90cd-3e85c149008f","Type":"ContainerStarted","Data":"dafa53f5192388f4f32e9193981d84abd05458b473852211682bab00ae55e5c6"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.114758 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:45 crc kubenswrapper[4964]: E1004 02:42:45.115050 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:45.615030874 +0000 UTC m=+145.511989582 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.115106 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" event={"ID":"0f7e7f85-336c-4706-ad26-f218ca626bed","Type":"ContainerStarted","Data":"b8ce50dc8a186c7cd9a49f05982c3c66df4d708c218cb8ec44ed3e776c713aa7"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.115280 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.119250 4964 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jbtzt container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.119290 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" podUID="0f7e7f85-336c-4706-ad26-f218ca626bed" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.126898 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cznct" 
event={"ID":"d30aed64-b8f7-4028-8dfc-f3661ce1c459","Type":"ContainerStarted","Data":"57bdf1591cedaa3cb662241ee331590b756a3757be10245cdb11915ea4ffc3f1"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.136103 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4crxp" event={"ID":"7cba096c-8054-4ad9-bb23-185f18482afb","Type":"ContainerStarted","Data":"f41bf67d16d1cfb437f08d748401a1bd5db877b4517fb71a45c2dadc22aa6cca"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.150139 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" podStartSLOduration=124.150120207 podStartE2EDuration="2m4.150120207s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.111361655 +0000 UTC m=+145.008320293" watchObservedRunningTime="2025-10-04 02:42:45.150120207 +0000 UTC m=+145.047078845" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.166194 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" podStartSLOduration=124.166175936 podStartE2EDuration="2m4.166175936s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.137451298 +0000 UTC m=+145.034409936" watchObservedRunningTime="2025-10-04 02:42:45.166175936 +0000 UTC m=+145.063134574" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.187478 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pg8r8" 
event={"ID":"c5873894-8da0-464d-9a26-adad29928a59","Type":"ContainerStarted","Data":"baa06be482536c8c92f0d44f11b597ef100c0da5973b2233e304b9e14748ce65"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.194761 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xgvpt" event={"ID":"2e961f5c-6d04-462b-9097-346efdfe347c","Type":"ContainerStarted","Data":"0669a493aa9b5df48b432e2524e5f4f6e0621715ffae87e27591fef841b72892"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.196555 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-xgvpt" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.197029 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-j54j2" podStartSLOduration=124.197018722 podStartE2EDuration="2m4.197018722s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.192177776 +0000 UTC m=+145.089136414" watchObservedRunningTime="2025-10-04 02:42:45.197018722 +0000 UTC m=+145.093977360" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.210779 4964 patch_prober.go:28] interesting pod/downloads-7954f5f757-xgvpt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.210879 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xgvpt" podUID="2e961f5c-6d04-462b-9097-346efdfe347c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.216592 
4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:45 crc kubenswrapper[4964]: E1004 02:42:45.216712 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:45.716688968 +0000 UTC m=+145.613647606 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.217269 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.219426 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" event={"ID":"1d532dfb-e972-4fcb-a141-0c27f317505f","Type":"ContainerStarted","Data":"b1235c7345a9c1dbfcaf07b6457babd88f8353626830cd63657779a10b090cc0"} Oct 04 02:42:45 
crc kubenswrapper[4964]: I1004 02:42:45.219699 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" event={"ID":"1d532dfb-e972-4fcb-a141-0c27f317505f","Type":"ContainerStarted","Data":"bde43553dea41cb6cfd035341afbec9b6fd84bb14e700685310c61dbcabbd9d4"} Oct 04 02:42:45 crc kubenswrapper[4964]: E1004 02:42:45.219821 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:45.719800568 +0000 UTC m=+145.616759196 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.224990 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4zll7" event={"ID":"f377b1b3-e592-418c-af18-6cd9c169d9c5","Type":"ContainerStarted","Data":"dcba9da67928b88c5e152c69f74cbb595c2e378bd105b45579e6c24c495d2f01"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.225751 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.233795 4964 patch_prober.go:28] interesting pod/console-operator-58897d9998-4zll7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: 
connect: connection refused" start-of-body= Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.233865 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4zll7" podUID="f377b1b3-e592-418c-af18-6cd9c169d9c5" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.235794 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gtc9j" podStartSLOduration=124.235780974 podStartE2EDuration="2m4.235780974s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.234039818 +0000 UTC m=+145.130998456" watchObservedRunningTime="2025-10-04 02:42:45.235780974 +0000 UTC m=+145.132739612" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.246152 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-br64d" event={"ID":"e624ca3f-538e-443a-8f2e-aa64988c9ce4","Type":"ContainerStarted","Data":"340accc0e23d9ff2952b685ea64fe6cbb9369d7887b2a05da17ead82a806ce73"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.262851 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz" event={"ID":"25b0c75e-6790-4622-80d9-d1182608eb38","Type":"ContainerStarted","Data":"88ab718943bee65d515338645e4b90d2c04dc73335d568eff2207df1b9ad3a0c"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.262899 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz" 
event={"ID":"25b0c75e-6790-4622-80d9-d1182608eb38","Type":"ContainerStarted","Data":"bfa5fe449732d3b685206d02edc97bacfaf59f2ac47dbd6d10c7f1acd1f6d8fa"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.286851 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" event={"ID":"d80374aa-c786-4e00-bb87-ac0998a61bc0","Type":"ContainerStarted","Data":"0660692929b8d180b0aa76bc8834362e4ff67668ce047ae37fbf6ee52a305f04"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.300995 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkldd" event={"ID":"d09e8523-afbc-4f5d-888d-92b350c15f7c","Type":"ContainerStarted","Data":"2ff7ee03b5357f26cb6e33634b6ba21af98f8d3a51b2cb2e2e9f58359dc71cd8"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.302757 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wrktt" event={"ID":"3756ce85-62bf-4b3b-94fc-ec155c12c913","Type":"ContainerStarted","Data":"44022555b129ee6d78058eaa8399e683e5427dc1cfd37649d893812929fed17f"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.316960 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" event={"ID":"3c16d39c-7067-4368-b4f6-324d9612c4de","Type":"ContainerStarted","Data":"738a07c4ebe0fcde3a83ce0875a7316d2f422294de2d725423207e716ff9a265"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.317018 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" event={"ID":"3c16d39c-7067-4368-b4f6-324d9612c4de","Type":"ContainerStarted","Data":"37937740883d06513fedf7b9e380998965ddd183ce1c9c608308f3cfc7ae5c4c"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.318107 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:45 crc kubenswrapper[4964]: E1004 02:42:45.318310 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:45.818285649 +0000 UTC m=+145.715244287 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.320655 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:45 crc kubenswrapper[4964]: E1004 02:42:45.320938 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:45.820926384 +0000 UTC m=+145.717885022 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.323607 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-qtbhx" podStartSLOduration=6.32358851 podStartE2EDuration="6.32358851s" podCreationTimestamp="2025-10-04 02:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.316521563 +0000 UTC m=+145.213480211" watchObservedRunningTime="2025-10-04 02:42:45.32358851 +0000 UTC m=+145.220547138" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.335692 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" event={"ID":"8c9da68e-d45d-44c9-be51-0b8a38042692","Type":"ContainerStarted","Data":"ace13eb154077a9ce9b1998b6d84d3d3d9a49ba81aab6b3b7a367ba7af151fc4"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.346209 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" event={"ID":"b564efb8-50db-4d84-a456-4857001ab84a","Type":"ContainerStarted","Data":"d1e9da36de58ece3e5865923a659594c8125c41213a77e28095eec472783a7db"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.346676 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.350192 4964 patch_prober.go:28] interesting 
pod/oauth-openshift-558db77b4-2zdrf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.350250 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" podUID="b564efb8-50db-4d84-a456-4857001ab84a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.350905 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dp6lk" event={"ID":"3f005aad-20eb-493e-8b74-fb1cf25030aa","Type":"ContainerStarted","Data":"082fb27281ec703a3193240e6a5a5741a65c55b4837bf5e4eabf1077a644bd4e"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.350945 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dp6lk" event={"ID":"3f005aad-20eb-493e-8b74-fb1cf25030aa","Type":"ContainerStarted","Data":"e7af2efe4c5deb328c9cd697a0c9e9601f6544596acf6cffb3545e3d30905f28"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.356209 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-khlb2" event={"ID":"c1430fe6-c0d6-4356-8aa3-b3c06f738c2f","Type":"ContainerStarted","Data":"1cc08b6a42394307d5b02eee49120070ee277c79804fbe13cad1d11c269a6da0"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.360179 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" event={"ID":"38588578-4883-453a-8a92-1bef1dc0f479","Type":"ContainerStarted","Data":"f39785059cef79bb3effcaa49709692f79635c052cdf4828e4e48d285a9cfca3"} Oct 
04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.365047 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-flq6t" event={"ID":"a9e3ab37-3e49-432d-9c9e-de56eafc9591","Type":"ContainerStarted","Data":"7e4145f44499c230841fc2d8dfa0c3d6885dbbf296fc017fbb0a62ddc2841067"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.367914 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" event={"ID":"c7b419e3-b339-4a82-8cb1-c14467712c1f","Type":"ContainerStarted","Data":"78ec000dcd20eb17f9deb299cce5dbe68a7bbd40ef1808db1f07120a7481c3ab"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.369347 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.370572 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" event={"ID":"31a61679-e5a4-444f-a77d-bd158e7a1dce","Type":"ContainerStarted","Data":"80fc1074bb093a8442ddc3679478cd34f67ef8b27625cd5541d9446030d8d2e8"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.373739 4964 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-rm5rh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.373786 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" podUID="c7b419e3-b339-4a82-8cb1-c14467712c1f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 04 
02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.379599 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt" event={"ID":"59e47053-7257-4a23-81d0-c4965cde15bf","Type":"ContainerStarted","Data":"2b4db783fb3f3fd0314cfd03a26760a736cbef7cf9c9f645442eb7c1e8269fc7"} Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.384283 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" podStartSLOduration=124.384273111 podStartE2EDuration="2m4.384273111s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.341111896 +0000 UTC m=+145.238070544" watchObservedRunningTime="2025-10-04 02:42:45.384273111 +0000 UTC m=+145.281231749" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.385010 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-cznct" podStartSLOduration=124.385007144 podStartE2EDuration="2m4.385007144s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.384161687 +0000 UTC m=+145.281120325" watchObservedRunningTime="2025-10-04 02:42:45.385007144 +0000 UTC m=+145.281965782" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.427431 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:45 crc kubenswrapper[4964]: E1004 02:42:45.429061 4964 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:45.929044997 +0000 UTC m=+145.826003635 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.429757 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vk6lb" podStartSLOduration=124.429741149 podStartE2EDuration="2m4.429741149s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.427008131 +0000 UTC m=+145.323966779" watchObservedRunningTime="2025-10-04 02:42:45.429741149 +0000 UTC m=+145.326699787" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.505066 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vb4qk" podStartSLOduration=124.505042692 podStartE2EDuration="2m4.505042692s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.502022174 +0000 UTC m=+145.398980812" watchObservedRunningTime="2025-10-04 02:42:45.505042692 +0000 UTC 
m=+145.402001330" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.529314 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:45 crc kubenswrapper[4964]: E1004 02:42:45.529765 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.02974787 +0000 UTC m=+145.926706508 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.546508 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wrktt" podStartSLOduration=6.546491991 podStartE2EDuration="6.546491991s" podCreationTimestamp="2025-10-04 02:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.54366975 +0000 UTC m=+145.440628398" watchObservedRunningTime="2025-10-04 02:42:45.546491991 +0000 UTC m=+145.443450619" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.620961 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-vng4p" podStartSLOduration=124.620942415 podStartE2EDuration="2m4.620942415s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.620593564 +0000 UTC m=+145.517552192" watchObservedRunningTime="2025-10-04 02:42:45.620942415 +0000 UTC m=+145.517901053" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.622239 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" podStartSLOduration=124.622229797 podStartE2EDuration="2m4.622229797s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.585973896 +0000 UTC m=+145.482932534" watchObservedRunningTime="2025-10-04 02:42:45.622229797 +0000 UTC m=+145.519188435" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.630561 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:45 crc kubenswrapper[4964]: E1004 02:42:45.630724 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.13069373 +0000 UTC m=+146.027652368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.630972 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:45 crc kubenswrapper[4964]: E1004 02:42:45.631377 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.131362192 +0000 UTC m=+146.028320830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.664051 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-xgvpt" podStartSLOduration=124.664034147 podStartE2EDuration="2m4.664034147s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.663978966 +0000 UTC m=+145.560937604" watchObservedRunningTime="2025-10-04 02:42:45.664034147 +0000 UTC m=+145.560992785" Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.733605 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:45 crc kubenswrapper[4964]: E1004 02:42:45.733747 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.233720738 +0000 UTC m=+146.130679376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.733872 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk"
Oct 04 02:42:45 crc kubenswrapper[4964]: E1004 02:42:45.734139 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.234126031 +0000 UTC m=+146.131084679 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.744058 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-khlb2" podStartSLOduration=124.744039031 podStartE2EDuration="2m4.744039031s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.7430658 +0000 UTC m=+145.640024438" watchObservedRunningTime="2025-10-04 02:42:45.744039031 +0000 UTC m=+145.640997669"
Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.744968 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" podStartSLOduration=124.744959931 podStartE2EDuration="2m4.744959931s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.717863086 +0000 UTC m=+145.614821724" watchObservedRunningTime="2025-10-04 02:42:45.744959931 +0000 UTC m=+145.641918569"
Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.785012 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-4zll7" podStartSLOduration=124.784999334 podStartE2EDuration="2m4.784999334s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.784785498 +0000 UTC m=+145.681744126" watchObservedRunningTime="2025-10-04 02:42:45.784999334 +0000 UTC m=+145.681957972"
Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.831653 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkldd" podStartSLOduration=124.831634411 podStartE2EDuration="2m4.831634411s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.829893034 +0000 UTC m=+145.726851672" watchObservedRunningTime="2025-10-04 02:42:45.831634411 +0000 UTC m=+145.728593049"
Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.835132 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 02:42:45 crc kubenswrapper[4964]: E1004 02:42:45.835473 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.335457224 +0000 UTC m=+146.232415862 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.937364 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk"
Oct 04 02:42:45 crc kubenswrapper[4964]: E1004 02:42:45.938018 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.438007257 +0000 UTC m=+146.334965895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:45 crc kubenswrapper[4964]: I1004 02:42:45.956366 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" podStartSLOduration=124.956349419 podStartE2EDuration="2m4.956349419s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:45.940738624 +0000 UTC m=+145.837697262" watchObservedRunningTime="2025-10-04 02:42:45.956349419 +0000 UTC m=+145.853308057"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.038284 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 02:42:46 crc kubenswrapper[4964]: E1004 02:42:46.038465 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.53844259 +0000 UTC m=+146.435401228 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.038639 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk"
Oct 04 02:42:46 crc kubenswrapper[4964]: E1004 02:42:46.038920 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.538907896 +0000 UTC m=+146.435866534 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.045448 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-khlb2"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.047688 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.047739 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.063101 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-br64d" podStartSLOduration=125.063085736 podStartE2EDuration="2m5.063085736s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:46.040876679 +0000 UTC m=+145.937835317" watchObservedRunningTime="2025-10-04 02:42:46.063085736 +0000 UTC m=+145.960044374"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.063543 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-prlpt" podStartSLOduration=125.063540241 podStartE2EDuration="2m5.063540241s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:46.06163346 +0000 UTC m=+145.958592098" watchObservedRunningTime="2025-10-04 02:42:46.063540241 +0000 UTC m=+145.960498879"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.140094 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 02:42:46 crc kubenswrapper[4964]: E1004 02:42:46.140291 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.64026066 +0000 UTC m=+146.537219298 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.140520 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk"
Oct 04 02:42:46 crc kubenswrapper[4964]: E1004 02:42:46.140848 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.640841418 +0000 UTC m=+146.537800056 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.241992 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 02:42:46 crc kubenswrapper[4964]: E1004 02:42:46.242161 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.74213669 +0000 UTC m=+146.639095328 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.242403 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk"
Oct 04 02:42:46 crc kubenswrapper[4964]: E1004 02:42:46.242709 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.742701148 +0000 UTC m=+146.639659786 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.343752 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 02:42:46 crc kubenswrapper[4964]: E1004 02:42:46.343930 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.843905987 +0000 UTC m=+146.740864625 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.343978 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk"
Oct 04 02:42:46 crc kubenswrapper[4964]: E1004 02:42:46.344301 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.84428732 +0000 UTC m=+146.741245958 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.386072 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz" event={"ID":"25b0c75e-6790-4622-80d9-d1182608eb38","Type":"ContainerStarted","Data":"51d274e5b05858c061bd7ad5a1d178f3f2b3e6e71b509a881e953c057711ae16"}
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.386188 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.387896 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" event={"ID":"108d1f0d-1354-4b1b-a3c3-fa5b14cee77f","Type":"ContainerStarted","Data":"b2b31fab21ca550aede0618d9fb7441bd718830795150451da1bf04111108b90"}
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.388010 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.389137 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" event={"ID":"d80374aa-c786-4e00-bb87-ac0998a61bc0","Type":"ContainerStarted","Data":"fe81404054f800d24c76486e1617aaa145d4b23ee4258e352f09c8a1c12ffb70"}
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.390399 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-flq6t" event={"ID":"a9e3ab37-3e49-432d-9c9e-de56eafc9591","Type":"ContainerStarted","Data":"af93c50bdf3f97500b1d214392728a8d0c16a6a178fd177454be3d1e74668701"}
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.390428 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-flq6t" event={"ID":"a9e3ab37-3e49-432d-9c9e-de56eafc9591","Type":"ContainerStarted","Data":"8a9a35d0b4ca772bba1003c9dcb609682aa5033c8d817ae0c5fbfddff387076a"}
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.390504 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-flq6t"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.392201 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" event={"ID":"8c9da68e-d45d-44c9-be51-0b8a38042692","Type":"ContainerStarted","Data":"18ec1ee369adde42712a7d99bd325731a620c1f671bcbeb7e074137e16fc8669"}
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.393555 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4crxp" event={"ID":"7cba096c-8054-4ad9-bb23-185f18482afb","Type":"ContainerStarted","Data":"e7e2f9be950b13dc1422916106e8ffae70c7b7f3a3a9482e31890bb5d414d37c"}
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.397520 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dp6lk" event={"ID":"3f005aad-20eb-493e-8b74-fb1cf25030aa","Type":"ContainerStarted","Data":"b85d9dd32b7d909def45bc7872f830655e109a01a64d2ec432f358b2d8ebdd65"}
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.402118 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm" event={"ID":"1fa08c06-83f8-4256-b721-af9b30f9f915","Type":"ContainerStarted","Data":"1501c4657ca1a29466bd0abc407960b217ba228ca2a323ac19afe0e0c32faf44"}
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.405121 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pg8r8" event={"ID":"c5873894-8da0-464d-9a26-adad29928a59","Type":"ContainerStarted","Data":"7ea444097909568fe18c623f984a26b90fffa7b195f3b920b1f67aa6caa85db4"}
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.409927 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr" event={"ID":"6018b91d-752d-4b19-9121-705d34195d35","Type":"ContainerStarted","Data":"723774d3994dc1e0d10aab3e3a7b450b040b514f303a17ccb3f431633ac314d8"}
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.416985 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-br64d" event={"ID":"e624ca3f-538e-443a-8f2e-aa64988c9ce4","Type":"ContainerStarted","Data":"771343278f1e8de4c9f2a9bfa91fa4dff360620ea99a3e975489a583e430f32f"}
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.441637 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" event={"ID":"c90c99ad-ac5c-4b40-8eba-e10fa92c1059","Type":"ContainerStarted","Data":"a8f302913b40e59cbe73c53974b174d64559624c4c0166385a958fb690dfbb0e"}
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.445398 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 02:42:46 crc kubenswrapper[4964]: E1004 02:42:46.445671 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.945651884 +0000 UTC m=+146.842610522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.445933 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk"
Oct 04 02:42:46 crc kubenswrapper[4964]: E1004 02:42:46.447793 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:46.947779622 +0000 UTC m=+146.844738260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.454870 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" event={"ID":"1d532dfb-e972-4fcb-a141-0c27f317505f","Type":"ContainerStarted","Data":"13e16d8a5821b92bb4adac576bf4bd2a338676e6b7d0e2630cb20fa9238a0b01"}
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.456988 4964 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-t4p6t container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body=
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.457024 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" podUID="bafaf93c-8a25-4b7e-8791-7d10b46ee161" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.457045 4964 patch_prober.go:28] interesting pod/console-operator-58897d9998-4zll7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body=
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.457110 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4zll7" podUID="f377b1b3-e592-418c-af18-6cd9c169d9c5" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.457279 4964 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jbtzt container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body=
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.457304 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" podUID="0f7e7f85-336c-4706-ad26-f218ca626bed" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.457469 4964 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qhj8g container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body=
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.457500 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" podUID="2c097e98-163f-4ede-90a4-c9aa7318aaa0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.459661 4964 patch_prober.go:28] interesting pod/downloads-7954f5f757-xgvpt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.459707 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xgvpt" podUID="2e961f5c-6d04-462b-9097-346efdfe347c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.459769 4964 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-d78cz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body=
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.459784 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" podUID="bb432ba7-089d-40a7-a0a7-43b3217a2527" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.486793 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz" podStartSLOduration=125.486759911 podStartE2EDuration="2m5.486759911s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:46.440856019 +0000 UTC m=+146.337814657" watchObservedRunningTime="2025-10-04 02:42:46.486759911 +0000 UTC m=+146.383718549"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.487526 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpssm" podStartSLOduration=125.487522456 podStartE2EDuration="2m5.487522456s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:46.486033128 +0000 UTC m=+146.382991766" watchObservedRunningTime="2025-10-04 02:42:46.487522456 +0000 UTC m=+146.384481094"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.526599 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-4crxp" podStartSLOduration=125.526574038 podStartE2EDuration="2m5.526574038s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:46.526084971 +0000 UTC m=+146.423043609" watchObservedRunningTime="2025-10-04 02:42:46.526574038 +0000 UTC m=+146.423532666"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.548173 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 02:42:46 crc kubenswrapper[4964]: E1004 02:42:46.549867 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:47.049848329 +0000 UTC m=+146.946806967 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.637734 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-pg8r8" podStartSLOduration=125.637714437 podStartE2EDuration="2m5.637714437s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:46.583843427 +0000 UTC m=+146.480802065" watchObservedRunningTime="2025-10-04 02:42:46.637714437 +0000 UTC m=+146.534673075"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.638376 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dp6lk" podStartSLOduration=125.638372089 podStartE2EDuration="2m5.638372089s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:46.635570939 +0000 UTC m=+146.532529567" watchObservedRunningTime="2025-10-04 02:42:46.638372089 +0000 UTC m=+146.535330727"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.651271 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk"
Oct 04 02:42:46 crc kubenswrapper[4964]: E1004 02:42:46.651529 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:47.151518273 +0000 UTC m=+147.048476911 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.688871 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-flq6t" podStartSLOduration=7.68885108 podStartE2EDuration="7.68885108s" podCreationTimestamp="2025-10-04 02:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:46.68329806 +0000 UTC m=+146.580256698" watchObservedRunningTime="2025-10-04 02:42:46.68885108 +0000 UTC m=+146.585809708"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.747714 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" podStartSLOduration=125.74769629 podStartE2EDuration="2m5.74769629s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:46.742072419 +0000 UTC m=+146.639031057" watchObservedRunningTime="2025-10-04 02:42:46.74769629 +0000 UTC m=+146.644654928"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.752300 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 02:42:46 crc kubenswrapper[4964]: E1004 02:42:46.752705 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:47.252689131 +0000 UTC m=+147.149647769 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.805975 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mtqzr" podStartSLOduration=125.805958081 podStartE2EDuration="2m5.805958081s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:46.805128945 +0000 UTC m=+146.702087583" watchObservedRunningTime="2025-10-04 02:42:46.805958081 +0000 UTC m=+146.702916719"
Oct 04 02:42:46 crc kubenswrapper[4964]: I1004
02:42:46.854436 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:46 crc kubenswrapper[4964]: E1004 02:42:46.854824 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:47.35481277 +0000 UTC m=+147.251771408 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.862907 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gs4qw" podStartSLOduration=125.862891161 podStartE2EDuration="2m5.862891161s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:46.834344508 +0000 UTC m=+146.731303146" watchObservedRunningTime="2025-10-04 02:42:46.862891161 +0000 UTC m=+146.759849799" Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.864515 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-etcd-operator/etcd-operator-b45778765-sjkbq" podStartSLOduration=125.864508093 podStartE2EDuration="2m5.864508093s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:46.86316751 +0000 UTC m=+146.760126148" watchObservedRunningTime="2025-10-04 02:42:46.864508093 +0000 UTC m=+146.761466731" Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.872153 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.915550 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" podStartSLOduration=125.915535241 podStartE2EDuration="2m5.915535241s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:46.913289229 +0000 UTC m=+146.810247867" watchObservedRunningTime="2025-10-04 02:42:46.915535241 +0000 UTC m=+146.812493879" Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.956849 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:46 crc kubenswrapper[4964]: E1004 02:42:46.957273 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-04 02:42:47.457258258 +0000 UTC m=+147.354216896 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:46 crc kubenswrapper[4964]: I1004 02:42:46.998901 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zrngt" podStartSLOduration=125.998882933 podStartE2EDuration="2m5.998882933s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:46.940157506 +0000 UTC m=+146.837116144" watchObservedRunningTime="2025-10-04 02:42:46.998882933 +0000 UTC m=+146.895841571" Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.059826 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 02:42:47 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:42:47 crc kubenswrapper[4964]: [+]process-running ok Oct 04 02:42:47 crc kubenswrapper[4964]: healthz check failed Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.059884 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" 
Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.060436 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.060493 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.061007 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:47 crc kubenswrapper[4964]: E1004 02:42:47.061331 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:47.561320141 +0000 UTC m=+147.458278779 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.064014 4964 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-ffdcs container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.064054 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" podUID="c90c99ad-ac5c-4b40-8eba-e10fa92c1059" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.070950 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.164366 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:47 crc kubenswrapper[4964]: E1004 02:42:47.164974 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:47.664957928 +0000 UTC m=+147.561916566 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.267314 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:47 crc kubenswrapper[4964]: E1004 02:42:47.267682 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:47.767669415 +0000 UTC m=+147.664628053 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.368964 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:47 crc kubenswrapper[4964]: E1004 02:42:47.369607 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:47.869582548 +0000 UTC m=+147.766541186 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.459854 4964 generic.go:334] "Generic (PLEG): container finished" podID="77f14c97-6ee6-4ff8-b98d-3c8cd595b994" containerID="a06cc299f9052b11b553a1e032f51ab5b8b257d86e053b9d70fe6eec68471cdb" exitCode=0 Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.459924 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" event={"ID":"77f14c97-6ee6-4ff8-b98d-3c8cd595b994","Type":"ContainerDied","Data":"a06cc299f9052b11b553a1e032f51ab5b8b257d86e053b9d70fe6eec68471cdb"} Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.461537 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" event={"ID":"31a61679-e5a4-444f-a77d-bd158e7a1dce","Type":"ContainerStarted","Data":"70074403dcbca82299e186556c18c03c3f0a897511efacd53bd5e8fbe1bcc1d7"} Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.468056 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t4p6t" Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.470111 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhj8g" Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.470399 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:47 crc kubenswrapper[4964]: E1004 02:42:47.470739 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:47.970725834 +0000 UTC m=+147.867684472 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.571280 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:47 crc kubenswrapper[4964]: E1004 02:42:47.571463 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.071437118 +0000 UTC m=+147.968395756 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.572242 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:47 crc kubenswrapper[4964]: E1004 02:42:47.576710 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.076694016 +0000 UTC m=+147.973652764 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.673784 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:47 crc kubenswrapper[4964]: E1004 02:42:47.674108 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.174092713 +0000 UTC m=+148.071051351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.775516 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:47 crc kubenswrapper[4964]: E1004 02:42:47.775863 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.27585144 +0000 UTC m=+148.172810068 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.876106 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:47 crc kubenswrapper[4964]: E1004 02:42:47.876300 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.376271613 +0000 UTC m=+148.273230251 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.876434 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:47 crc kubenswrapper[4964]: E1004 02:42:47.876786 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.37677841 +0000 UTC m=+148.273737048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:47 crc kubenswrapper[4964]: I1004 02:42:47.977316 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:47 crc kubenswrapper[4964]: E1004 02:42:47.977686 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.477660169 +0000 UTC m=+148.374618807 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.057209 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 02:42:48 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:42:48 crc kubenswrapper[4964]: [+]process-running ok Oct 04 02:42:48 crc kubenswrapper[4964]: healthz check failed Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.057496 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.079589 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:48 crc kubenswrapper[4964]: E1004 02:42:48.079671 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-04 02:42:48.579655723 +0000 UTC m=+148.476614361 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.181033 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:48 crc kubenswrapper[4964]: E1004 02:42:48.181374 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.681358047 +0000 UTC m=+148.578316685 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.282028 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:48 crc kubenswrapper[4964]: E1004 02:42:48.282380 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.78236855 +0000 UTC m=+148.679327178 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.382655 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:48 crc kubenswrapper[4964]: E1004 02:42:48.383064 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.883049992 +0000 UTC m=+148.780008630 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.462438 4964 patch_prober.go:28] interesting pod/console-operator-58897d9998-4zll7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.462506 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4zll7" podUID="f377b1b3-e592-418c-af18-6cd9c169d9c5" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.469052 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" event={"ID":"31a61679-e5a4-444f-a77d-bd158e7a1dce","Type":"ContainerStarted","Data":"292678e64af474aa4de17e46a10b480793025ba12ef742faf766d877b0d53510"} Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.469088 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" event={"ID":"31a61679-e5a4-444f-a77d-bd158e7a1dce","Type":"ContainerStarted","Data":"b6b8d15fcd339081b83d011a5df67b1b2890b8a1176780284e0116711a2082a6"} Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 
02:42:48.469098 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" event={"ID":"31a61679-e5a4-444f-a77d-bd158e7a1dce","Type":"ContainerStarted","Data":"2c42ef0cd7e08bbb021167f4e3eebba10d4f1e661363b7f185f055e75aee8201"} Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.484451 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:48 crc kubenswrapper[4964]: E1004 02:42:48.484758 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:48.984748058 +0000 UTC m=+148.881706696 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.577813 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.585767 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:48 crc kubenswrapper[4964]: E1004 02:42:48.586411 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:49.086396971 +0000 UTC m=+148.983355609 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.650871 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-rlnbj" podStartSLOduration=9.650857273 podStartE2EDuration="9.650857273s" podCreationTimestamp="2025-10-04 02:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:48.514911401 +0000 UTC m=+148.411870039" watchObservedRunningTime="2025-10-04 02:42:48.650857273 +0000 UTC m=+148.547815911" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.690389 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.690432 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.690507 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:48 crc kubenswrapper[4964]: E1004 02:42:48.690788 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:49.190777352 +0000 UTC m=+149.087735980 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.692510 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.703053 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.720106 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lfrsh" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.791343 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:48 crc kubenswrapper[4964]: E1004 02:42:48.791497 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:49.291466294 +0000 UTC m=+149.188424932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.791547 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.791794 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.791874 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:42:48 crc kubenswrapper[4964]: E1004 02:42:48.792783 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-10-04 02:42:49.292775667 +0000 UTC m=+149.189734305 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.795510 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.795998 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.870095 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.882101 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.892388 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:48 crc kubenswrapper[4964]: E1004 02:42:48.892725 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:49.392710565 +0000 UTC m=+149.289669203 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.930411 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8k555"] Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.931267 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.942166 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.944881 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8k555"] Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.966581 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.996476 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:48 crc kubenswrapper[4964]: E1004 02:42:48.996952 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:49.496939321 +0000 UTC m=+149.393897959 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.997118 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29d99f96-8f30-452b-9ca2-1f0c640380f8-utilities\") pod \"certified-operators-8k555\" (UID: \"29d99f96-8f30-452b-9ca2-1f0c640380f8\") " pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.997136 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29d99f96-8f30-452b-9ca2-1f0c640380f8-catalog-content\") pod \"certified-operators-8k555\" (UID: \"29d99f96-8f30-452b-9ca2-1f0c640380f8\") " pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:42:48 crc kubenswrapper[4964]: I1004 02:42:48.997178 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztbpw\" (UniqueName: \"kubernetes.io/projected/29d99f96-8f30-452b-9ca2-1f0c640380f8-kube-api-access-ztbpw\") pod \"certified-operators-8k555\" (UID: \"29d99f96-8f30-452b-9ca2-1f0c640380f8\") " pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.057582 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http 
failed: reason withheld Oct 04 02:42:49 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:42:49 crc kubenswrapper[4964]: [+]process-running ok Oct 04 02:42:49 crc kubenswrapper[4964]: healthz check failed Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.057650 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.058276 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.108027 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-secret-volume\") pod \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\" (UID: \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\") " Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.108277 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.108306 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhmq6\" (UniqueName: \"kubernetes.io/projected/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-kube-api-access-vhmq6\") pod \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\" (UID: \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\") " Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.108337 4964 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-config-volume\") pod \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\" (UID: \"77f14c97-6ee6-4ff8-b98d-3c8cd595b994\") " Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.108459 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztbpw\" (UniqueName: \"kubernetes.io/projected/29d99f96-8f30-452b-9ca2-1f0c640380f8-kube-api-access-ztbpw\") pod \"certified-operators-8k555\" (UID: \"29d99f96-8f30-452b-9ca2-1f0c640380f8\") " pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.108534 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29d99f96-8f30-452b-9ca2-1f0c640380f8-utilities\") pod \"certified-operators-8k555\" (UID: \"29d99f96-8f30-452b-9ca2-1f0c640380f8\") " pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.108548 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29d99f96-8f30-452b-9ca2-1f0c640380f8-catalog-content\") pod \"certified-operators-8k555\" (UID: \"29d99f96-8f30-452b-9ca2-1f0c640380f8\") " pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.108932 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29d99f96-8f30-452b-9ca2-1f0c640380f8-catalog-content\") pod \"certified-operators-8k555\" (UID: \"29d99f96-8f30-452b-9ca2-1f0c640380f8\") " pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.113159 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-config-volume" (OuterVolumeSpecName: "config-volume") pod "77f14c97-6ee6-4ff8-b98d-3c8cd595b994" (UID: "77f14c97-6ee6-4ff8-b98d-3c8cd595b994"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:42:49 crc kubenswrapper[4964]: E1004 02:42:49.113266 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:49.613246698 +0000 UTC m=+149.510205336 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.113996 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29d99f96-8f30-452b-9ca2-1f0c640380f8-utilities\") pod \"certified-operators-8k555\" (UID: \"29d99f96-8f30-452b-9ca2-1f0c640380f8\") " pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.123410 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-kube-api-access-vhmq6" (OuterVolumeSpecName: "kube-api-access-vhmq6") pod "77f14c97-6ee6-4ff8-b98d-3c8cd595b994" (UID: "77f14c97-6ee6-4ff8-b98d-3c8cd595b994"). InnerVolumeSpecName "kube-api-access-vhmq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.126269 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9gg8f"] Oct 04 02:42:49 crc kubenswrapper[4964]: E1004 02:42:49.126442 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77f14c97-6ee6-4ff8-b98d-3c8cd595b994" containerName="collect-profiles" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.126458 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="77f14c97-6ee6-4ff8-b98d-3c8cd595b994" containerName="collect-profiles" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.126532 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="77f14c97-6ee6-4ff8-b98d-3c8cd595b994" containerName="collect-profiles" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.127394 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.132300 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "77f14c97-6ee6-4ff8-b98d-3c8cd595b994" (UID: "77f14c97-6ee6-4ff8-b98d-3c8cd595b994"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.134960 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.146347 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztbpw\" (UniqueName: \"kubernetes.io/projected/29d99f96-8f30-452b-9ca2-1f0c640380f8-kube-api-access-ztbpw\") pod \"certified-operators-8k555\" (UID: \"29d99f96-8f30-452b-9ca2-1f0c640380f8\") " pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.155966 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9gg8f"] Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.210240 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.210293 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78pn6\" (UniqueName: \"kubernetes.io/projected/698d3183-93e5-4693-8b24-cc507a41d274-kube-api-access-78pn6\") pod \"community-operators-9gg8f\" (UID: \"698d3183-93e5-4693-8b24-cc507a41d274\") " pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.210315 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698d3183-93e5-4693-8b24-cc507a41d274-catalog-content\") pod 
\"community-operators-9gg8f\" (UID: \"698d3183-93e5-4693-8b24-cc507a41d274\") " pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.210368 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698d3183-93e5-4693-8b24-cc507a41d274-utilities\") pod \"community-operators-9gg8f\" (UID: \"698d3183-93e5-4693-8b24-cc507a41d274\") " pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.210402 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhmq6\" (UniqueName: \"kubernetes.io/projected/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-kube-api-access-vhmq6\") on node \"crc\" DevicePath \"\"" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.210413 4964 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-config-volume\") on node \"crc\" DevicePath \"\"" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.210421 4964 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77f14c97-6ee6-4ff8-b98d-3c8cd595b994-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 04 02:42:49 crc kubenswrapper[4964]: E1004 02:42:49.210685 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:49.710671675 +0000 UTC m=+149.607630313 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.251043 4964 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.260088 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.313922 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:49 crc kubenswrapper[4964]: E1004 02:42:49.314154 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:49.814128357 +0000 UTC m=+149.711086995 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.314373 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698d3183-93e5-4693-8b24-cc507a41d274-utilities\") pod \"community-operators-9gg8f\" (UID: \"698d3183-93e5-4693-8b24-cc507a41d274\") " pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.314414 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.314446 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78pn6\" (UniqueName: \"kubernetes.io/projected/698d3183-93e5-4693-8b24-cc507a41d274-kube-api-access-78pn6\") pod \"community-operators-9gg8f\" (UID: \"698d3183-93e5-4693-8b24-cc507a41d274\") " pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.314461 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698d3183-93e5-4693-8b24-cc507a41d274-catalog-content\") pod \"community-operators-9gg8f\" (UID: 
\"698d3183-93e5-4693-8b24-cc507a41d274\") " pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.314818 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698d3183-93e5-4693-8b24-cc507a41d274-catalog-content\") pod \"community-operators-9gg8f\" (UID: \"698d3183-93e5-4693-8b24-cc507a41d274\") " pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.315009 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698d3183-93e5-4693-8b24-cc507a41d274-utilities\") pod \"community-operators-9gg8f\" (UID: \"698d3183-93e5-4693-8b24-cc507a41d274\") " pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:42:49 crc kubenswrapper[4964]: E1004 02:42:49.315215 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:49.815204341 +0000 UTC m=+149.712162979 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.340993 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xzv28"] Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.341868 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.387405 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78pn6\" (UniqueName: \"kubernetes.io/projected/698d3183-93e5-4693-8b24-cc507a41d274-kube-api-access-78pn6\") pod \"community-operators-9gg8f\" (UID: \"698d3183-93e5-4693-8b24-cc507a41d274\") " pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.412947 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xzv28"] Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.414977 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.415130 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39c2152-b733-43f5-acd7-75e8948518f0-utilities\") pod \"certified-operators-xzv28\" (UID: \"b39c2152-b733-43f5-acd7-75e8948518f0\") " pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.415183 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39c2152-b733-43f5-acd7-75e8948518f0-catalog-content\") pod \"certified-operators-xzv28\" (UID: \"b39c2152-b733-43f5-acd7-75e8948518f0\") " pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.415221 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4rvc\" (UniqueName: \"kubernetes.io/projected/b39c2152-b733-43f5-acd7-75e8948518f0-kube-api-access-j4rvc\") pod \"certified-operators-xzv28\" (UID: \"b39c2152-b733-43f5-acd7-75e8948518f0\") " pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:42:49 crc kubenswrapper[4964]: E1004 02:42:49.415309 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:49.915292835 +0000 UTC m=+149.812251473 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.450239 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.518455 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hw9j9"] Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.518931 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.519073 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39c2152-b733-43f5-acd7-75e8948518f0-utilities\") pod \"certified-operators-xzv28\" (UID: \"b39c2152-b733-43f5-acd7-75e8948518f0\") " pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.519224 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39c2152-b733-43f5-acd7-75e8948518f0-catalog-content\") pod \"certified-operators-xzv28\" (UID: \"b39c2152-b733-43f5-acd7-75e8948518f0\") " pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.519373 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4rvc\" (UniqueName: \"kubernetes.io/projected/b39c2152-b733-43f5-acd7-75e8948518f0-kube-api-access-j4rvc\") pod \"certified-operators-xzv28\" (UID: \"b39c2152-b733-43f5-acd7-75e8948518f0\") " pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.519405 4964 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.520645 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39c2152-b733-43f5-acd7-75e8948518f0-catalog-content\") pod \"certified-operators-xzv28\" (UID: \"b39c2152-b733-43f5-acd7-75e8948518f0\") " pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:42:49 crc kubenswrapper[4964]: E1004 02:42:49.520823 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:50.020806582 +0000 UTC m=+149.917765220 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.520825 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" event={"ID":"77f14c97-6ee6-4ff8-b98d-3c8cd595b994","Type":"ContainerDied","Data":"b56918fe5c57378f118da363c3fecf1b2764b53e90b4c4a691a3a6ebcaa11043"} Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.520869 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b56918fe5c57378f118da363c3fecf1b2764b53e90b4c4a691a3a6ebcaa11043" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.520906 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.521083 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39c2152-b733-43f5-acd7-75e8948518f0-utilities\") pod \"certified-operators-xzv28\" (UID: \"b39c2152-b733-43f5-acd7-75e8948518f0\") " pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.522248 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"19baff99db672afef80677036584e3302acf762193e68aee813226353311214f"} Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.523587 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"04bcb8048ab268a2147712bfb0e17dd216a4e632a29e117d8e00a498352582d2"} Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.531024 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hw9j9"] Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.547376 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4rvc\" (UniqueName: \"kubernetes.io/projected/b39c2152-b733-43f5-acd7-75e8948518f0-kube-api-access-j4rvc\") pod \"certified-operators-xzv28\" (UID: \"b39c2152-b733-43f5-acd7-75e8948518f0\") " pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.620344 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.620781 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6858848-2cfa-4910-9078-6d94d3e875d5-utilities\") pod \"community-operators-hw9j9\" (UID: \"a6858848-2cfa-4910-9078-6d94d3e875d5\") " pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.620807 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7cxd\" (UniqueName: \"kubernetes.io/projected/a6858848-2cfa-4910-9078-6d94d3e875d5-kube-api-access-v7cxd\") pod \"community-operators-hw9j9\" (UID: \"a6858848-2cfa-4910-9078-6d94d3e875d5\") " pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:42:49 crc kubenswrapper[4964]: E1004 02:42:49.620853 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:50.120827443 +0000 UTC m=+150.017786081 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.621049 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.621085 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6858848-2cfa-4910-9078-6d94d3e875d5-catalog-content\") pod \"community-operators-hw9j9\" (UID: \"a6858848-2cfa-4910-9078-6d94d3e875d5\") " pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:42:49 crc kubenswrapper[4964]: E1004 02:42:49.621409 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:50.121402061 +0000 UTC m=+150.018360699 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.689482 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.690046 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.692120 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.693822 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.698209 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.710918 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.721825 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.722061 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6858848-2cfa-4910-9078-6d94d3e875d5-catalog-content\") pod \"community-operators-hw9j9\" (UID: \"a6858848-2cfa-4910-9078-6d94d3e875d5\") " pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.722104 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6858848-2cfa-4910-9078-6d94d3e875d5-utilities\") pod \"community-operators-hw9j9\" (UID: \"a6858848-2cfa-4910-9078-6d94d3e875d5\") " pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.722121 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7cxd\" (UniqueName: \"kubernetes.io/projected/a6858848-2cfa-4910-9078-6d94d3e875d5-kube-api-access-v7cxd\") pod \"community-operators-hw9j9\" (UID: \"a6858848-2cfa-4910-9078-6d94d3e875d5\") " pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:42:49 crc kubenswrapper[4964]: E1004 02:42:49.722353 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-04 02:42:50.222338042 +0000 UTC m=+150.119296680 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.722673 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6858848-2cfa-4910-9078-6d94d3e875d5-catalog-content\") pod \"community-operators-hw9j9\" (UID: \"a6858848-2cfa-4910-9078-6d94d3e875d5\") " pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.722872 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6858848-2cfa-4910-9078-6d94d3e875d5-utilities\") pod \"community-operators-hw9j9\" (UID: \"a6858848-2cfa-4910-9078-6d94d3e875d5\") " pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.742435 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7cxd\" (UniqueName: \"kubernetes.io/projected/a6858848-2cfa-4910-9078-6d94d3e875d5-kube-api-access-v7cxd\") pod \"community-operators-hw9j9\" (UID: \"a6858848-2cfa-4910-9078-6d94d3e875d5\") " pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.775751 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8k555"] Oct 04 02:42:49 crc kubenswrapper[4964]: W1004 02:42:49.783218 4964 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29d99f96_8f30_452b_9ca2_1f0c640380f8.slice/crio-a2dd474d4840c71f88827a5e683cb04f9ec146c2ba5c453a322cfe7991f757cc WatchSource:0}: Error finding container a2dd474d4840c71f88827a5e683cb04f9ec146c2ba5c453a322cfe7991f757cc: Status 404 returned error can't find the container with id a2dd474d4840c71f88827a5e683cb04f9ec146c2ba5c453a322cfe7991f757cc Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.801855 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9gg8f"] Oct 04 02:42:49 crc kubenswrapper[4964]: W1004 02:42:49.807746 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod698d3183_93e5_4693_8b24_cc507a41d274.slice/crio-7747e0f37c1a9d79122f92b2f4dfe9acf0094eea69c7f515f7488bdf58076cf7 WatchSource:0}: Error finding container 7747e0f37c1a9d79122f92b2f4dfe9acf0094eea69c7f515f7488bdf58076cf7: Status 404 returned error can't find the container with id 7747e0f37c1a9d79122f92b2f4dfe9acf0094eea69c7f515f7488bdf58076cf7 Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.824640 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b21a779b-5ac3-45ab-bb35-cd476548696b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b21a779b-5ac3-45ab-bb35-cd476548696b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.825130 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.825176 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b21a779b-5ac3-45ab-bb35-cd476548696b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b21a779b-5ac3-45ab-bb35-cd476548696b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 02:42:49 crc kubenswrapper[4964]: E1004 02:42:49.825185 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:50.325169994 +0000 UTC m=+150.222128632 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.882019 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.915324 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xzv28"] Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.926415 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:49 crc kubenswrapper[4964]: E1004 02:42:49.926599 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:50.426575259 +0000 UTC m=+150.323533897 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.926723 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.926762 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b21a779b-5ac3-45ab-bb35-cd476548696b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b21a779b-5ac3-45ab-bb35-cd476548696b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.926828 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b21a779b-5ac3-45ab-bb35-cd476548696b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b21a779b-5ac3-45ab-bb35-cd476548696b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.926916 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b21a779b-5ac3-45ab-bb35-cd476548696b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"b21a779b-5ac3-45ab-bb35-cd476548696b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 02:42:49 crc kubenswrapper[4964]: E1004 02:42:49.927193 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:50.427182469 +0000 UTC m=+150.324141107 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:49 crc kubenswrapper[4964]: W1004 02:42:49.928349 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb39c2152_b733_43f5_acd7_75e8948518f0.slice/crio-42b2453f3a76abcd98888ca9166737f1e9d3d3f46a81e848491dd94078f83913 WatchSource:0}: Error finding container 42b2453f3a76abcd98888ca9166737f1e9d3d3f46a81e848491dd94078f83913: Status 404 returned error can't find the container with id 42b2453f3a76abcd98888ca9166737f1e9d3d3f46a81e848491dd94078f83913 Oct 04 02:42:49 crc kubenswrapper[4964]: I1004 02:42:49.947562 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b21a779b-5ac3-45ab-bb35-cd476548696b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b21a779b-5ac3-45ab-bb35-cd476548696b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.028402 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:50 crc kubenswrapper[4964]: E1004 02:42:50.028530 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 02:42:50.528512952 +0000 UTC m=+150.425471590 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.028765 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:50 crc kubenswrapper[4964]: E1004 02:42:50.029046 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 02:42:50.529039729 +0000 UTC m=+150.425998367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wppjk" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.048262 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 02:42:50 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:42:50 crc kubenswrapper[4964]: [+]process-running ok Oct 04 02:42:50 crc kubenswrapper[4964]: healthz check failed Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.048313 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.048290 4964 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-04T02:42:49.25106439Z","Handler":null,"Name":""} Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.050768 4964 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.050795 4964 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: 
kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.052176 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.052226 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.057599 4964 patch_prober.go:28] interesting pod/apiserver-76f77b778f-8dqhm container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 04 02:42:50 crc kubenswrapper[4964]: [+]log ok Oct 04 02:42:50 crc kubenswrapper[4964]: [+]etcd ok Oct 04 02:42:50 crc kubenswrapper[4964]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 04 02:42:50 crc kubenswrapper[4964]: [+]poststarthook/generic-apiserver-start-informers ok Oct 04 02:42:50 crc kubenswrapper[4964]: [+]poststarthook/max-in-flight-filter ok Oct 04 02:42:50 crc kubenswrapper[4964]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 04 02:42:50 crc kubenswrapper[4964]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 04 02:42:50 crc kubenswrapper[4964]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 04 02:42:50 crc kubenswrapper[4964]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 04 02:42:50 crc kubenswrapper[4964]: [+]poststarthook/project.openshift.io-projectcache ok Oct 04 02:42:50 crc kubenswrapper[4964]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 04 02:42:50 crc kubenswrapper[4964]: [+]poststarthook/openshift.io-startinformers ok Oct 04 02:42:50 crc kubenswrapper[4964]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 04 02:42:50 
crc kubenswrapper[4964]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 04 02:42:50 crc kubenswrapper[4964]: livez check failed Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.057676 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" podUID="8c9da68e-d45d-44c9-be51-0b8a38042692" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.073839 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.079679 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hw9j9"] Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.082569 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:42:50 crc kubenswrapper[4964]: W1004 02:42:50.090088 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6858848_2cfa_4910_9078_6d94d3e875d5.slice/crio-a02bd683587accb1194e8f06b4fefee1d23e70400c0b499540a906d13f33a3ed WatchSource:0}: Error finding container a02bd683587accb1194e8f06b4fefee1d23e70400c0b499540a906d13f33a3ed: Status 404 returned error can't find the container with id a02bd683587accb1194e8f06b4fefee1d23e70400c0b499540a906d13f33a3ed Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.129423 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 02:42:50 
crc kubenswrapper[4964]: I1004 02:42:50.137747 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.230350 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.234957 4964 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.234994 4964 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.285181 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.289348 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wppjk\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.529848 4964 generic.go:334] "Generic (PLEG): container finished" podID="29d99f96-8f30-452b-9ca2-1f0c640380f8" containerID="be8597e169aeb4bd12bf05772354bc69f2315d1791bb32e570759118ca04e941" exitCode=0 Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.529991 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k555" event={"ID":"29d99f96-8f30-452b-9ca2-1f0c640380f8","Type":"ContainerDied","Data":"be8597e169aeb4bd12bf05772354bc69f2315d1791bb32e570759118ca04e941"} Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.530057 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k555" 
event={"ID":"29d99f96-8f30-452b-9ca2-1f0c640380f8","Type":"ContainerStarted","Data":"a2dd474d4840c71f88827a5e683cb04f9ec146c2ba5c453a322cfe7991f757cc"} Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.531594 4964 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.532285 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b200e3ceae273169dda85fcd259b21b80b9487111077c5e8e47dbb5ad98e1f72"} Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.532351 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"257b319a14b66eb1f0d19307ac025dd115288b967ea8c2a1594fc5dfadf718fd"} Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.534190 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b21a779b-5ac3-45ab-bb35-cd476548696b","Type":"ContainerStarted","Data":"f0bdd515e1ac8facf39761905b631860958c113403c22f3e5a869fd16185aacf"} Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.536250 4964 generic.go:334] "Generic (PLEG): container finished" podID="a6858848-2cfa-4910-9078-6d94d3e875d5" containerID="d96b71f6221b168a08fb080ef4247feb2c16346a966ef3e7a8e1a9eda03ca064" exitCode=0 Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.536337 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hw9j9" event={"ID":"a6858848-2cfa-4910-9078-6d94d3e875d5","Type":"ContainerDied","Data":"d96b71f6221b168a08fb080ef4247feb2c16346a966ef3e7a8e1a9eda03ca064"} Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.536370 4964 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hw9j9" event={"ID":"a6858848-2cfa-4910-9078-6d94d3e875d5","Type":"ContainerStarted","Data":"a02bd683587accb1194e8f06b4fefee1d23e70400c0b499540a906d13f33a3ed"} Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.537938 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.539699 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1f6b20dfea28721604e9c6bad6ef96977b2913ee615778eac5b8afeb6ab17043"} Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.542118 4964 generic.go:334] "Generic (PLEG): container finished" podID="698d3183-93e5-4693-8b24-cc507a41d274" containerID="fbd8235582b3706b172c24b1ee2b892ceae05ce344f453bb49e333acdfb09f12" exitCode=0 Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.542172 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gg8f" event={"ID":"698d3183-93e5-4693-8b24-cc507a41d274","Type":"ContainerDied","Data":"fbd8235582b3706b172c24b1ee2b892ceae05ce344f453bb49e333acdfb09f12"} Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.542197 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gg8f" event={"ID":"698d3183-93e5-4693-8b24-cc507a41d274","Type":"ContainerStarted","Data":"7747e0f37c1a9d79122f92b2f4dfe9acf0094eea69c7f515f7488bdf58076cf7"} Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.544646 4964 generic.go:334] "Generic (PLEG): container finished" podID="b39c2152-b733-43f5-acd7-75e8948518f0" containerID="1af8fb64c4638f8f2592d7cf60a47f7d8510fb2688aa381cb09f29046d575cf4" exitCode=0 Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 
02:42:50.544695 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xzv28" event={"ID":"b39c2152-b733-43f5-acd7-75e8948518f0","Type":"ContainerDied","Data":"1af8fb64c4638f8f2592d7cf60a47f7d8510fb2688aa381cb09f29046d575cf4"} Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.544713 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xzv28" event={"ID":"b39c2152-b733-43f5-acd7-75e8948518f0","Type":"ContainerStarted","Data":"42b2453f3a76abcd98888ca9166737f1e9d3d3f46a81e848491dd94078f83913"} Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.547231 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c1703abf6eb2aa19b3c0e19e4dc49e7afdcdc64dce05ed6aba9e04a454507255"} Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.547437 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.828652 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wppjk"] Oct 04 02:42:50 crc kubenswrapper[4964]: W1004 02:42:50.848351 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15ca5de7_b5aa_4d86_82b5_122f22b494ee.slice/crio-197680c3c569be03412cf4593f84d7170d199e46ce7d6517d9503cabd51933eb WatchSource:0}: Error finding container 197680c3c569be03412cf4593f84d7170d199e46ce7d6517d9503cabd51933eb: Status 404 returned error can't find the container with id 197680c3c569be03412cf4593f84d7170d199e46ce7d6517d9503cabd51933eb Oct 04 02:42:50 crc kubenswrapper[4964]: I1004 02:42:50.856149 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.048099 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 02:42:51 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:42:51 crc kubenswrapper[4964]: [+]process-running ok Oct 04 02:42:51 crc kubenswrapper[4964]: healthz check failed Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.048164 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.111455 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kljsx"] Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.112600 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.115225 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.121396 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kljsx"] Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.243693 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1213b9a-e51e-4af9-835b-a39b5378ed60-catalog-content\") pod \"redhat-marketplace-kljsx\" (UID: \"e1213b9a-e51e-4af9-835b-a39b5378ed60\") " pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.243814 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfvc7\" (UniqueName: \"kubernetes.io/projected/e1213b9a-e51e-4af9-835b-a39b5378ed60-kube-api-access-sfvc7\") pod \"redhat-marketplace-kljsx\" (UID: \"e1213b9a-e51e-4af9-835b-a39b5378ed60\") " pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.243937 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1213b9a-e51e-4af9-835b-a39b5378ed60-utilities\") pod \"redhat-marketplace-kljsx\" (UID: \"e1213b9a-e51e-4af9-835b-a39b5378ed60\") " pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.346725 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1213b9a-e51e-4af9-835b-a39b5378ed60-utilities\") pod \"redhat-marketplace-kljsx\" (UID: 
\"e1213b9a-e51e-4af9-835b-a39b5378ed60\") " pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.346820 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1213b9a-e51e-4af9-835b-a39b5378ed60-catalog-content\") pod \"redhat-marketplace-kljsx\" (UID: \"e1213b9a-e51e-4af9-835b-a39b5378ed60\") " pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.346886 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfvc7\" (UniqueName: \"kubernetes.io/projected/e1213b9a-e51e-4af9-835b-a39b5378ed60-kube-api-access-sfvc7\") pod \"redhat-marketplace-kljsx\" (UID: \"e1213b9a-e51e-4af9-835b-a39b5378ed60\") " pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.347563 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1213b9a-e51e-4af9-835b-a39b5378ed60-utilities\") pod \"redhat-marketplace-kljsx\" (UID: \"e1213b9a-e51e-4af9-835b-a39b5378ed60\") " pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.348625 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1213b9a-e51e-4af9-835b-a39b5378ed60-catalog-content\") pod \"redhat-marketplace-kljsx\" (UID: \"e1213b9a-e51e-4af9-835b-a39b5378ed60\") " pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.370551 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfvc7\" (UniqueName: \"kubernetes.io/projected/e1213b9a-e51e-4af9-835b-a39b5378ed60-kube-api-access-sfvc7\") pod \"redhat-marketplace-kljsx\" (UID: 
\"e1213b9a-e51e-4af9-835b-a39b5378ed60\") " pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.435043 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.512282 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pmr8p"] Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.513752 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.517678 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmr8p"] Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.575555 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" event={"ID":"15ca5de7-b5aa-4d86-82b5-122f22b494ee","Type":"ContainerStarted","Data":"391178478c5a5a68d3a0ccf737524e75d080cf0cea853142ff842883deeb3fa6"} Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.576031 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" event={"ID":"15ca5de7-b5aa-4d86-82b5-122f22b494ee","Type":"ContainerStarted","Data":"197680c3c569be03412cf4593f84d7170d199e46ce7d6517d9503cabd51933eb"} Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.576049 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.580911 4964 generic.go:334] "Generic (PLEG): container finished" podID="b21a779b-5ac3-45ab-bb35-cd476548696b" containerID="e3a49fdc8ace8eef9260b8d2129776548d8a1200accf99c04b13a3af969fc165" exitCode=0 Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 
02:42:51.581064 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b21a779b-5ac3-45ab-bb35-cd476548696b","Type":"ContainerDied","Data":"e3a49fdc8ace8eef9260b8d2129776548d8a1200accf99c04b13a3af969fc165"} Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.605424 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" podStartSLOduration=130.605406236 podStartE2EDuration="2m10.605406236s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:42:51.601929153 +0000 UTC m=+151.498887791" watchObservedRunningTime="2025-10-04 02:42:51.605406236 +0000 UTC m=+151.502364874" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.651378 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c828d956-23f7-4720-9c34-63d6a33833b3-utilities\") pod \"redhat-marketplace-pmr8p\" (UID: \"c828d956-23f7-4720-9c34-63d6a33833b3\") " pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.651430 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww94v\" (UniqueName: \"kubernetes.io/projected/c828d956-23f7-4720-9c34-63d6a33833b3-kube-api-access-ww94v\") pod \"redhat-marketplace-pmr8p\" (UID: \"c828d956-23f7-4720-9c34-63d6a33833b3\") " pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.651528 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c828d956-23f7-4720-9c34-63d6a33833b3-catalog-content\") pod 
\"redhat-marketplace-pmr8p\" (UID: \"c828d956-23f7-4720-9c34-63d6a33833b3\") " pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.693141 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kljsx"] Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.752835 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c828d956-23f7-4720-9c34-63d6a33833b3-utilities\") pod \"redhat-marketplace-pmr8p\" (UID: \"c828d956-23f7-4720-9c34-63d6a33833b3\") " pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.752900 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww94v\" (UniqueName: \"kubernetes.io/projected/c828d956-23f7-4720-9c34-63d6a33833b3-kube-api-access-ww94v\") pod \"redhat-marketplace-pmr8p\" (UID: \"c828d956-23f7-4720-9c34-63d6a33833b3\") " pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.752985 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c828d956-23f7-4720-9c34-63d6a33833b3-catalog-content\") pod \"redhat-marketplace-pmr8p\" (UID: \"c828d956-23f7-4720-9c34-63d6a33833b3\") " pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.753751 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c828d956-23f7-4720-9c34-63d6a33833b3-catalog-content\") pod \"redhat-marketplace-pmr8p\" (UID: \"c828d956-23f7-4720-9c34-63d6a33833b3\") " pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.753777 4964 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c828d956-23f7-4720-9c34-63d6a33833b3-utilities\") pod \"redhat-marketplace-pmr8p\" (UID: \"c828d956-23f7-4720-9c34-63d6a33833b3\") " pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.771102 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww94v\" (UniqueName: \"kubernetes.io/projected/c828d956-23f7-4720-9c34-63d6a33833b3-kube-api-access-ww94v\") pod \"redhat-marketplace-pmr8p\" (UID: \"c828d956-23f7-4720-9c34-63d6a33833b3\") " pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:42:51 crc kubenswrapper[4964]: I1004 02:42:51.836367 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:42:52 crc kubenswrapper[4964]: E1004 02:42:52.042090 4964 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1213b9a_e51e_4af9_835b_a39b5378ed60.slice/crio-4d1d674c63b3ad694d476ade0f85c8490770b5dceed6306aa1006dc5be6dbcf6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1213b9a_e51e_4af9_835b_a39b5378ed60.slice/crio-conmon-4d1d674c63b3ad694d476ade0f85c8490770b5dceed6306aa1006dc5be6dbcf6.scope\": RecentStats: unable to find data in memory cache]" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.048239 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 02:42:52 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:42:52 crc kubenswrapper[4964]: [+]process-running ok Oct 04 02:42:52 crc 
kubenswrapper[4964]: healthz check failed Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.048275 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.069907 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.087035 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffdcs" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.100849 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.100891 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.111308 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6gfn4"] Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.112882 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.120591 4964 patch_prober.go:28] interesting pod/console-f9d7485db-cznct container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.120655 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-cznct" podUID="d30aed64-b8f7-4028-8dfc-f3661ce1c459" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.120968 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.138970 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6gfn4"] Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.147335 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmr8p"] Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.156991 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-utilities\") pod \"redhat-operators-6gfn4\" (UID: \"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6\") " pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.157027 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjr4b\" (UniqueName: \"kubernetes.io/projected/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-kube-api-access-rjr4b\") pod 
\"redhat-operators-6gfn4\" (UID: \"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6\") " pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.157097 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-catalog-content\") pod \"redhat-operators-6gfn4\" (UID: \"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6\") " pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:42:52 crc kubenswrapper[4964]: W1004 02:42:52.157235 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc828d956_23f7_4720_9c34_63d6a33833b3.slice/crio-73546740e00b97d4d7c13c812da3860ee399f9bd5335708ff2f2760ebfba6bd4 WatchSource:0}: Error finding container 73546740e00b97d4d7c13c812da3860ee399f9bd5335708ff2f2760ebfba6bd4: Status 404 returned error can't find the container with id 73546740e00b97d4d7c13c812da3860ee399f9bd5335708ff2f2760ebfba6bd4 Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.262752 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-utilities\") pod \"redhat-operators-6gfn4\" (UID: \"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6\") " pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.262798 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjr4b\" (UniqueName: \"kubernetes.io/projected/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-kube-api-access-rjr4b\") pod \"redhat-operators-6gfn4\" (UID: \"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6\") " pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.262841 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-catalog-content\") pod \"redhat-operators-6gfn4\" (UID: \"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6\") " pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.263275 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-catalog-content\") pod \"redhat-operators-6gfn4\" (UID: \"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6\") " pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.263452 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-utilities\") pod \"redhat-operators-6gfn4\" (UID: \"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6\") " pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.280551 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjr4b\" (UniqueName: \"kubernetes.io/projected/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-kube-api-access-rjr4b\") pod \"redhat-operators-6gfn4\" (UID: \"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6\") " pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.395150 4964 patch_prober.go:28] interesting pod/downloads-7954f5f757-xgvpt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.395439 4964 patch_prober.go:28] interesting pod/downloads-7954f5f757-xgvpt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 
10.217.0.13:8080: connect: connection refused" start-of-body= Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.395476 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xgvpt" podUID="2e961f5c-6d04-462b-9097-346efdfe347c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.395498 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xgvpt" podUID="2e961f5c-6d04-462b-9097-346efdfe347c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.435143 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-4zll7" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.448866 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.518930 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c8548"] Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.523414 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.536406 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c8548"] Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.545306 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jbtzt" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.572606 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d02de6-aa81-4246-bcea-838ed9fe84ed-catalog-content\") pod \"redhat-operators-c8548\" (UID: \"52d02de6-aa81-4246-bcea-838ed9fe84ed\") " pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.572723 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d02de6-aa81-4246-bcea-838ed9fe84ed-utilities\") pod \"redhat-operators-c8548\" (UID: \"52d02de6-aa81-4246-bcea-838ed9fe84ed\") " pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.572796 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfxwg\" (UniqueName: \"kubernetes.io/projected/52d02de6-aa81-4246-bcea-838ed9fe84ed-kube-api-access-bfxwg\") pod \"redhat-operators-c8548\" (UID: \"52d02de6-aa81-4246-bcea-838ed9fe84ed\") " pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.587309 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.636388 4964 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.637123 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.642727 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.643323 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.643674 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmr8p" event={"ID":"c828d956-23f7-4720-9c34-63d6a33833b3","Type":"ContainerDied","Data":"a9f1b0b776cda5eb4f128f6c297590b870730dd35af00d281015b1539f841f46"} Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.643192 4964 generic.go:334] "Generic (PLEG): container finished" podID="c828d956-23f7-4720-9c34-63d6a33833b3" containerID="a9f1b0b776cda5eb4f128f6c297590b870730dd35af00d281015b1539f841f46" exitCode=0 Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.643888 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmr8p" event={"ID":"c828d956-23f7-4720-9c34-63d6a33833b3","Type":"ContainerStarted","Data":"73546740e00b97d4d7c13c812da3860ee399f9bd5335708ff2f2760ebfba6bd4"} Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.651675 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.662030 4964 generic.go:334] "Generic (PLEG): container finished" podID="e1213b9a-e51e-4af9-835b-a39b5378ed60" containerID="4d1d674c63b3ad694d476ade0f85c8490770b5dceed6306aa1006dc5be6dbcf6" exitCode=0 Oct 04 02:42:52 crc kubenswrapper[4964]: 
I1004 02:42:52.662575 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kljsx" event={"ID":"e1213b9a-e51e-4af9-835b-a39b5378ed60","Type":"ContainerDied","Data":"4d1d674c63b3ad694d476ade0f85c8490770b5dceed6306aa1006dc5be6dbcf6"} Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.662642 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kljsx" event={"ID":"e1213b9a-e51e-4af9-835b-a39b5378ed60","Type":"ContainerStarted","Data":"c53288fe0d2d27fae5a344e21331e973587c1ea2211878d88fb0c8364f4569f1"} Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.674235 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfxwg\" (UniqueName: \"kubernetes.io/projected/52d02de6-aa81-4246-bcea-838ed9fe84ed-kube-api-access-bfxwg\") pod \"redhat-operators-c8548\" (UID: \"52d02de6-aa81-4246-bcea-838ed9fe84ed\") " pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.674311 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/772115d2-463f-4fba-a7c8-05aa4e460a22-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"772115d2-463f-4fba-a7c8-05aa4e460a22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.674407 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d02de6-aa81-4246-bcea-838ed9fe84ed-catalog-content\") pod \"redhat-operators-c8548\" (UID: \"52d02de6-aa81-4246-bcea-838ed9fe84ed\") " pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.674466 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/772115d2-463f-4fba-a7c8-05aa4e460a22-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"772115d2-463f-4fba-a7c8-05aa4e460a22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.674489 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d02de6-aa81-4246-bcea-838ed9fe84ed-utilities\") pod \"redhat-operators-c8548\" (UID: \"52d02de6-aa81-4246-bcea-838ed9fe84ed\") " pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.675757 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d02de6-aa81-4246-bcea-838ed9fe84ed-catalog-content\") pod \"redhat-operators-c8548\" (UID: \"52d02de6-aa81-4246-bcea-838ed9fe84ed\") " pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.679650 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d02de6-aa81-4246-bcea-838ed9fe84ed-utilities\") pod \"redhat-operators-c8548\" (UID: \"52d02de6-aa81-4246-bcea-838ed9fe84ed\") " pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.697044 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfxwg\" (UniqueName: \"kubernetes.io/projected/52d02de6-aa81-4246-bcea-838ed9fe84ed-kube-api-access-bfxwg\") pod \"redhat-operators-c8548\" (UID: \"52d02de6-aa81-4246-bcea-838ed9fe84ed\") " pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.778544 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/772115d2-463f-4fba-a7c8-05aa4e460a22-kubelet-dir\") pod 
\"revision-pruner-8-crc\" (UID: \"772115d2-463f-4fba-a7c8-05aa4e460a22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.778757 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/772115d2-463f-4fba-a7c8-05aa4e460a22-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"772115d2-463f-4fba-a7c8-05aa4e460a22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.779120 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/772115d2-463f-4fba-a7c8-05aa4e460a22-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"772115d2-463f-4fba-a7c8-05aa4e460a22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.815954 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/772115d2-463f-4fba-a7c8-05aa4e460a22-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"772115d2-463f-4fba-a7c8-05aa4e460a22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.860469 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.939087 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.953023 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.993377 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b21a779b-5ac3-45ab-bb35-cd476548696b-kubelet-dir\") pod \"b21a779b-5ac3-45ab-bb35-cd476548696b\" (UID: \"b21a779b-5ac3-45ab-bb35-cd476548696b\") " Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.993433 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b21a779b-5ac3-45ab-bb35-cd476548696b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b21a779b-5ac3-45ab-bb35-cd476548696b" (UID: "b21a779b-5ac3-45ab-bb35-cd476548696b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.993495 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b21a779b-5ac3-45ab-bb35-cd476548696b-kube-api-access\") pod \"b21a779b-5ac3-45ab-bb35-cd476548696b\" (UID: \"b21a779b-5ac3-45ab-bb35-cd476548696b\") " Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.993750 4964 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b21a779b-5ac3-45ab-bb35-cd476548696b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 04 02:42:52 crc kubenswrapper[4964]: I1004 02:42:52.998166 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b21a779b-5ac3-45ab-bb35-cd476548696b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b21a779b-5ac3-45ab-bb35-cd476548696b" (UID: "b21a779b-5ac3-45ab-bb35-cd476548696b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.018531 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6gfn4"] Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.046128 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.048529 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 02:42:53 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:42:53 crc kubenswrapper[4964]: [+]process-running ok Oct 04 02:42:53 crc kubenswrapper[4964]: healthz check failed Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.048586 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.095441 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b21a779b-5ac3-45ab-bb35-cd476548696b-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.146259 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c8548"] Oct 04 02:42:53 crc kubenswrapper[4964]: W1004 02:42:53.156997 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52d02de6_aa81_4246_bcea_838ed9fe84ed.slice/crio-6d857278b3b3c9b4dcf60dff125012e6b10a2897cec612d823c29fe290a27dbf 
WatchSource:0}: Error finding container 6d857278b3b3c9b4dcf60dff125012e6b10a2897cec612d823c29fe290a27dbf: Status 404 returned error can't find the container with id 6d857278b3b3c9b4dcf60dff125012e6b10a2897cec612d823c29fe290a27dbf Oct 04 02:42:53 crc kubenswrapper[4964]: W1004 02:42:53.452651 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod772115d2_463f_4fba_a7c8_05aa4e460a22.slice/crio-ee4fd5ea17ed55400133af1f1af911023cca3e3f351ad4e3ee6e24e1c1be8f0f WatchSource:0}: Error finding container ee4fd5ea17ed55400133af1f1af911023cca3e3f351ad4e3ee6e24e1c1be8f0f: Status 404 returned error can't find the container with id ee4fd5ea17ed55400133af1f1af911023cca3e3f351ad4e3ee6e24e1c1be8f0f Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.454859 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.689226 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.689221 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b21a779b-5ac3-45ab-bb35-cd476548696b","Type":"ContainerDied","Data":"f0bdd515e1ac8facf39761905b631860958c113403c22f3e5a869fd16185aacf"} Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.689363 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0bdd515e1ac8facf39761905b631860958c113403c22f3e5a869fd16185aacf" Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.692678 4964 generic.go:334] "Generic (PLEG): container finished" podID="e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" containerID="1858754f8ac97f2fe9a40b1e13a620dd7234ee0f87d7358db1112127779efb42" exitCode=0 Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.692765 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gfn4" event={"ID":"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6","Type":"ContainerDied","Data":"1858754f8ac97f2fe9a40b1e13a620dd7234ee0f87d7358db1112127779efb42"} Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.693005 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gfn4" event={"ID":"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6","Type":"ContainerStarted","Data":"efeb7227e7895f9f4f79d291b2395e9dc57631717b04aaca245da3a6b98edb39"} Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.696166 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"772115d2-463f-4fba-a7c8-05aa4e460a22","Type":"ContainerStarted","Data":"ee4fd5ea17ed55400133af1f1af911023cca3e3f351ad4e3ee6e24e1c1be8f0f"} Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.699110 4964 generic.go:334] "Generic (PLEG): container finished" 
podID="52d02de6-aa81-4246-bcea-838ed9fe84ed" containerID="9a24fc1e64d51fb059eaa13188c8e26dcba190efe946971ef0bc87cb606c9a0a" exitCode=0 Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.699368 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c8548" event={"ID":"52d02de6-aa81-4246-bcea-838ed9fe84ed","Type":"ContainerDied","Data":"9a24fc1e64d51fb059eaa13188c8e26dcba190efe946971ef0bc87cb606c9a0a"} Oct 04 02:42:53 crc kubenswrapper[4964]: I1004 02:42:53.699391 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c8548" event={"ID":"52d02de6-aa81-4246-bcea-838ed9fe84ed","Type":"ContainerStarted","Data":"6d857278b3b3c9b4dcf60dff125012e6b10a2897cec612d823c29fe290a27dbf"} Oct 04 02:42:54 crc kubenswrapper[4964]: I1004 02:42:54.055151 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 02:42:54 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:42:54 crc kubenswrapper[4964]: [+]process-running ok Oct 04 02:42:54 crc kubenswrapper[4964]: healthz check failed Oct 04 02:42:54 crc kubenswrapper[4964]: I1004 02:42:54.055213 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:42:54 crc kubenswrapper[4964]: I1004 02:42:54.716326 4964 generic.go:334] "Generic (PLEG): container finished" podID="772115d2-463f-4fba-a7c8-05aa4e460a22" containerID="b490bdfb40a89d7eec3184d17c8ae5b9d9935e9d5e2ac9c3780603518b46329c" exitCode=0 Oct 04 02:42:54 crc kubenswrapper[4964]: I1004 02:42:54.716371 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"772115d2-463f-4fba-a7c8-05aa4e460a22","Type":"ContainerDied","Data":"b490bdfb40a89d7eec3184d17c8ae5b9d9935e9d5e2ac9c3780603518b46329c"} Oct 04 02:42:55 crc kubenswrapper[4964]: I1004 02:42:55.047887 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 02:42:55 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:42:55 crc kubenswrapper[4964]: [+]process-running ok Oct 04 02:42:55 crc kubenswrapper[4964]: healthz check failed Oct 04 02:42:55 crc kubenswrapper[4964]: I1004 02:42:55.047960 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:42:55 crc kubenswrapper[4964]: I1004 02:42:55.057050 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:55 crc kubenswrapper[4964]: I1004 02:42:55.057397 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-flq6t" Oct 04 02:42:55 crc kubenswrapper[4964]: I1004 02:42:55.061770 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-8dqhm" Oct 04 02:42:56 crc kubenswrapper[4964]: I1004 02:42:56.047453 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 02:42:56 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:42:56 crc 
kubenswrapper[4964]: [+]process-running ok Oct 04 02:42:56 crc kubenswrapper[4964]: healthz check failed Oct 04 02:42:56 crc kubenswrapper[4964]: I1004 02:42:56.047683 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:42:57 crc kubenswrapper[4964]: I1004 02:42:57.051690 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 02:42:57 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:42:57 crc kubenswrapper[4964]: [+]process-running ok Oct 04 02:42:57 crc kubenswrapper[4964]: healthz check failed Oct 04 02:42:57 crc kubenswrapper[4964]: I1004 02:42:57.051966 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:42:58 crc kubenswrapper[4964]: I1004 02:42:58.048214 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 02:42:58 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:42:58 crc kubenswrapper[4964]: [+]process-running ok Oct 04 02:42:58 crc kubenswrapper[4964]: healthz check failed Oct 04 02:42:58 crc kubenswrapper[4964]: I1004 02:42:58.048270 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:42:59 crc kubenswrapper[4964]: I1004 02:42:59.047940 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 02:42:59 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:42:59 crc kubenswrapper[4964]: [+]process-running ok Oct 04 02:42:59 crc kubenswrapper[4964]: healthz check failed Oct 04 02:42:59 crc kubenswrapper[4964]: I1004 02:42:59.047992 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:43:00 crc kubenswrapper[4964]: I1004 02:43:00.048366 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 02:43:00 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:43:00 crc kubenswrapper[4964]: [+]process-running ok Oct 04 02:43:00 crc kubenswrapper[4964]: healthz check failed Oct 04 02:43:00 crc kubenswrapper[4964]: I1004 02:43:00.048456 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:43:01 crc kubenswrapper[4964]: I1004 02:43:01.050260 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 02:43:01 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:43:01 crc kubenswrapper[4964]: [+]process-running ok Oct 04 02:43:01 crc kubenswrapper[4964]: healthz check failed Oct 04 02:43:01 crc kubenswrapper[4964]: I1004 02:43:01.050574 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:43:02 crc kubenswrapper[4964]: I1004 02:43:02.047529 4964 patch_prober.go:28] interesting pod/router-default-5444994796-khlb2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 02:43:02 crc kubenswrapper[4964]: [-]has-synced failed: reason withheld Oct 04 02:43:02 crc kubenswrapper[4964]: [+]process-running ok Oct 04 02:43:02 crc kubenswrapper[4964]: healthz check failed Oct 04 02:43:02 crc kubenswrapper[4964]: I1004 02:43:02.047597 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-khlb2" podUID="c1430fe6-c0d6-4356-8aa3-b3c06f738c2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 02:43:02 crc kubenswrapper[4964]: I1004 02:43:02.101053 4964 patch_prober.go:28] interesting pod/console-f9d7485db-cznct container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Oct 04 02:43:02 crc kubenswrapper[4964]: I1004 02:43:02.101131 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-cznct" podUID="d30aed64-b8f7-4028-8dfc-f3661ce1c459" containerName="console" probeResult="failure" 
output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Oct 04 02:43:02 crc kubenswrapper[4964]: I1004 02:43:02.413582 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-xgvpt" Oct 04 02:43:02 crc kubenswrapper[4964]: I1004 02:43:02.784945 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 02:43:02 crc kubenswrapper[4964]: I1004 02:43:02.797737 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"772115d2-463f-4fba-a7c8-05aa4e460a22","Type":"ContainerDied","Data":"ee4fd5ea17ed55400133af1f1af911023cca3e3f351ad4e3ee6e24e1c1be8f0f"} Oct 04 02:43:02 crc kubenswrapper[4964]: I1004 02:43:02.797790 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee4fd5ea17ed55400133af1f1af911023cca3e3f351ad4e3ee6e24e1c1be8f0f" Oct 04 02:43:02 crc kubenswrapper[4964]: I1004 02:43:02.797851 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 02:43:02 crc kubenswrapper[4964]: I1004 02:43:02.866496 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/772115d2-463f-4fba-a7c8-05aa4e460a22-kube-api-access\") pod \"772115d2-463f-4fba-a7c8-05aa4e460a22\" (UID: \"772115d2-463f-4fba-a7c8-05aa4e460a22\") " Oct 04 02:43:02 crc kubenswrapper[4964]: I1004 02:43:02.866664 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/772115d2-463f-4fba-a7c8-05aa4e460a22-kubelet-dir\") pod \"772115d2-463f-4fba-a7c8-05aa4e460a22\" (UID: \"772115d2-463f-4fba-a7c8-05aa4e460a22\") " Oct 04 02:43:02 crc kubenswrapper[4964]: I1004 02:43:02.866958 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/772115d2-463f-4fba-a7c8-05aa4e460a22-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "772115d2-463f-4fba-a7c8-05aa4e460a22" (UID: "772115d2-463f-4fba-a7c8-05aa4e460a22"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:43:02 crc kubenswrapper[4964]: I1004 02:43:02.873226 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/772115d2-463f-4fba-a7c8-05aa4e460a22-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "772115d2-463f-4fba-a7c8-05aa4e460a22" (UID: "772115d2-463f-4fba-a7c8-05aa4e460a22"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:43:02 crc kubenswrapper[4964]: I1004 02:43:02.968805 4964 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/772115d2-463f-4fba-a7c8-05aa4e460a22-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 04 02:43:02 crc kubenswrapper[4964]: I1004 02:43:02.968841 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/772115d2-463f-4fba-a7c8-05aa4e460a22-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 04 02:43:03 crc kubenswrapper[4964]: I1004 02:43:03.068467 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:43:03 crc kubenswrapper[4964]: I1004 02:43:03.073327 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-khlb2" Oct 04 02:43:04 crc kubenswrapper[4964]: I1004 02:43:04.449553 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:43:04 crc kubenswrapper[4964]: I1004 02:43:04.450117 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:43:04 crc kubenswrapper[4964]: I1004 02:43:04.897817 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs\") pod 
\"network-metrics-daemon-xrr6r\" (UID: \"7f1c9150-b444-41bb-9233-d76c4765a2d0\") " pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:43:04 crc kubenswrapper[4964]: I1004 02:43:04.909201 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f1c9150-b444-41bb-9233-d76c4765a2d0-metrics-certs\") pod \"network-metrics-daemon-xrr6r\" (UID: \"7f1c9150-b444-41bb-9233-d76c4765a2d0\") " pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:43:05 crc kubenswrapper[4964]: I1004 02:43:05.179697 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrr6r" Oct 04 02:43:10 crc kubenswrapper[4964]: I1004 02:43:10.546427 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:43:12 crc kubenswrapper[4964]: I1004 02:43:12.105998 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:43:12 crc kubenswrapper[4964]: I1004 02:43:12.110960 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:43:22 crc kubenswrapper[4964]: I1004 02:43:22.551363 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l4jsz" Oct 04 02:43:28 crc kubenswrapper[4964]: E1004 02:43:28.668981 4964 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 04 02:43:28 crc kubenswrapper[4964]: E1004 02:43:28.669718 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j4rvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-xzv28_openshift-marketplace(b39c2152-b733-43f5-acd7-75e8948518f0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 04 02:43:28 crc kubenswrapper[4964]: E1004 02:43:28.674736 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-xzv28" podUID="b39c2152-b733-43f5-acd7-75e8948518f0" Oct 04 02:43:28 crc kubenswrapper[4964]: I1004 02:43:28.888724 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 02:43:29 crc kubenswrapper[4964]: E1004 02:43:29.839707 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-xzv28" podUID="b39c2152-b733-43f5-acd7-75e8948518f0" Oct 04 02:43:29 crc kubenswrapper[4964]: E1004 02:43:29.917361 4964 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 04 02:43:29 crc kubenswrapper[4964]: E1004 02:43:29.917869 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78pn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-9gg8f_openshift-marketplace(698d3183-93e5-4693-8b24-cc507a41d274): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 04 02:43:29 crc kubenswrapper[4964]: E1004 02:43:29.919028 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-9gg8f" podUID="698d3183-93e5-4693-8b24-cc507a41d274" Oct 04 02:43:29 crc 
kubenswrapper[4964]: E1004 02:43:29.950605 4964 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 04 02:43:29 crc kubenswrapper[4964]: E1004 02:43:29.950762 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ztbpw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-8k555_openshift-marketplace(29d99f96-8f30-452b-9ca2-1f0c640380f8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 04 02:43:29 crc kubenswrapper[4964]: E1004 02:43:29.953288 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8k555" podUID="29d99f96-8f30-452b-9ca2-1f0c640380f8" Oct 04 02:43:29 crc kubenswrapper[4964]: E1004 02:43:29.963932 4964 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 04 02:43:29 crc kubenswrapper[4964]: E1004 02:43:29.964067 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v7cxd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hw9j9_openshift-marketplace(a6858848-2cfa-4910-9078-6d94d3e875d5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 04 02:43:29 crc kubenswrapper[4964]: E1004 02:43:29.965332 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hw9j9" podUID="a6858848-2cfa-4910-9078-6d94d3e875d5" Oct 04 02:43:30 crc 
kubenswrapper[4964]: E1004 02:43:30.551936 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-8k555" podUID="29d99f96-8f30-452b-9ca2-1f0c640380f8" Oct 04 02:43:30 crc kubenswrapper[4964]: E1004 02:43:30.552273 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-9gg8f" podUID="698d3183-93e5-4693-8b24-cc507a41d274" Oct 04 02:43:30 crc kubenswrapper[4964]: E1004 02:43:30.553041 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hw9j9" podUID="a6858848-2cfa-4910-9078-6d94d3e875d5" Oct 04 02:43:30 crc kubenswrapper[4964]: E1004 02:43:30.608197 4964 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 04 02:43:30 crc kubenswrapper[4964]: E1004 02:43:30.608367 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sfvc7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-kljsx_openshift-marketplace(e1213b9a-e51e-4af9-835b-a39b5378ed60): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 04 02:43:30 crc kubenswrapper[4964]: E1004 02:43:30.609541 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-kljsx" podUID="e1213b9a-e51e-4af9-835b-a39b5378ed60" Oct 04 02:43:30 crc 
kubenswrapper[4964]: E1004 02:43:30.644293 4964 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 04 02:43:30 crc kubenswrapper[4964]: E1004 02:43:30.644479 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ww94v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-pmr8p_openshift-marketplace(c828d956-23f7-4720-9c34-63d6a33833b3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 04 02:43:30 crc kubenswrapper[4964]: E1004 02:43:30.646425 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-pmr8p" podUID="c828d956-23f7-4720-9c34-63d6a33833b3" Oct 04 02:43:33 crc kubenswrapper[4964]: E1004 02:43:33.474525 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kljsx" podUID="e1213b9a-e51e-4af9-835b-a39b5378ed60" Oct 04 02:43:33 crc kubenswrapper[4964]: E1004 02:43:33.476239 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-pmr8p" podUID="c828d956-23f7-4720-9c34-63d6a33833b3" Oct 04 02:43:33 crc kubenswrapper[4964]: E1004 02:43:33.497690 4964 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 04 02:43:33 crc kubenswrapper[4964]: E1004 02:43:33.497818 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bfxwg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-c8548_openshift-marketplace(52d02de6-aa81-4246-bcea-838ed9fe84ed): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 04 02:43:33 crc kubenswrapper[4964]: E1004 02:43:33.499378 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-c8548" podUID="52d02de6-aa81-4246-bcea-838ed9fe84ed" Oct 04 02:43:33 crc kubenswrapper[4964]: I1004 02:43:33.929322 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xrr6r"] Oct 04 02:43:33 crc kubenswrapper[4964]: I1004 02:43:33.999247 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" event={"ID":"7f1c9150-b444-41bb-9233-d76c4765a2d0","Type":"ContainerStarted","Data":"ab525e8de44bfb3780eb97f912f9949beb28aa78657d75af56b045229b74f54a"} Oct 04 02:43:34 crc kubenswrapper[4964]: E1004 02:43:34.011220 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-c8548" podUID="52d02de6-aa81-4246-bcea-838ed9fe84ed" Oct 04 02:43:34 crc kubenswrapper[4964]: I1004 02:43:34.449607 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:43:34 crc kubenswrapper[4964]: I1004 02:43:34.449962 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:43:35 crc kubenswrapper[4964]: I1004 02:43:35.017251 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" 
event={"ID":"7f1c9150-b444-41bb-9233-d76c4765a2d0","Type":"ContainerStarted","Data":"7e8adce556c513ace0112cfe1ab97153ad345cf32846f786f436feaa3cb66e67"} Oct 04 02:43:35 crc kubenswrapper[4964]: I1004 02:43:35.017300 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xrr6r" event={"ID":"7f1c9150-b444-41bb-9233-d76c4765a2d0","Type":"ContainerStarted","Data":"2935b6dec57c37ac297d3569b86a7ad9a760b314b59ecdaeadcaa07b328806df"} Oct 04 02:43:35 crc kubenswrapper[4964]: I1004 02:43:35.022340 4964 generic.go:334] "Generic (PLEG): container finished" podID="e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" containerID="5e0fee700df337b740a0a7b62a7b6b1c8c3d05bf2383e02be7b7457c2f6a0332" exitCode=0 Oct 04 02:43:35 crc kubenswrapper[4964]: I1004 02:43:35.022412 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gfn4" event={"ID":"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6","Type":"ContainerDied","Data":"5e0fee700df337b740a0a7b62a7b6b1c8c3d05bf2383e02be7b7457c2f6a0332"} Oct 04 02:43:35 crc kubenswrapper[4964]: I1004 02:43:35.046494 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xrr6r" podStartSLOduration=174.046432589 podStartE2EDuration="2m54.046432589s" podCreationTimestamp="2025-10-04 02:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:43:35.039501585 +0000 UTC m=+194.936460233" watchObservedRunningTime="2025-10-04 02:43:35.046432589 +0000 UTC m=+194.943391267" Oct 04 02:43:38 crc kubenswrapper[4964]: I1004 02:43:38.041256 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gfn4" event={"ID":"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6","Type":"ContainerStarted","Data":"c61fbcbd30503ee62bbbda82d0ff881e0eebf6254c924cc2e527a402ae92ddc0"} Oct 04 02:43:38 crc kubenswrapper[4964]: I1004 
02:43:38.068201 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6gfn4" podStartSLOduration=2.762187251 podStartE2EDuration="46.068178168s" podCreationTimestamp="2025-10-04 02:42:52 +0000 UTC" firstStartedPulling="2025-10-04 02:42:53.714324795 +0000 UTC m=+153.611283433" lastFinishedPulling="2025-10-04 02:43:37.020315692 +0000 UTC m=+196.917274350" observedRunningTime="2025-10-04 02:43:38.065683731 +0000 UTC m=+197.962642369" watchObservedRunningTime="2025-10-04 02:43:38.068178168 +0000 UTC m=+197.965136836" Oct 04 02:43:42 crc kubenswrapper[4964]: I1004 02:43:42.449739 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:43:42 crc kubenswrapper[4964]: I1004 02:43:42.450125 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:43:43 crc kubenswrapper[4964]: I1004 02:43:43.770332 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6gfn4" podUID="e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" containerName="registry-server" probeResult="failure" output=< Oct 04 02:43:43 crc kubenswrapper[4964]: timeout: failed to connect service ":50051" within 1s Oct 04 02:43:43 crc kubenswrapper[4964]: > Oct 04 02:43:52 crc kubenswrapper[4964]: I1004 02:43:52.536836 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:43:52 crc kubenswrapper[4964]: I1004 02:43:52.609009 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:43:57 crc kubenswrapper[4964]: I1004 02:43:57.187402 4964 generic.go:334] "Generic (PLEG): container finished" podID="e1213b9a-e51e-4af9-835b-a39b5378ed60" containerID="2381dbca20bd7c5d44b8656328c0ceb912e8e8382413f5aee757e82c541090b1" 
exitCode=0 Oct 04 02:43:57 crc kubenswrapper[4964]: I1004 02:43:57.187497 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kljsx" event={"ID":"e1213b9a-e51e-4af9-835b-a39b5378ed60","Type":"ContainerDied","Data":"2381dbca20bd7c5d44b8656328c0ceb912e8e8382413f5aee757e82c541090b1"} Oct 04 02:43:57 crc kubenswrapper[4964]: I1004 02:43:57.194551 4964 generic.go:334] "Generic (PLEG): container finished" podID="b39c2152-b733-43f5-acd7-75e8948518f0" containerID="458e5627b3ac90dc60863823a3442f587e1e5e54880fe3ec03f7dc7ab1cae5db" exitCode=0 Oct 04 02:43:57 crc kubenswrapper[4964]: I1004 02:43:57.194662 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xzv28" event={"ID":"b39c2152-b733-43f5-acd7-75e8948518f0","Type":"ContainerDied","Data":"458e5627b3ac90dc60863823a3442f587e1e5e54880fe3ec03f7dc7ab1cae5db"} Oct 04 02:43:57 crc kubenswrapper[4964]: I1004 02:43:57.197273 4964 generic.go:334] "Generic (PLEG): container finished" podID="29d99f96-8f30-452b-9ca2-1f0c640380f8" containerID="f94770f5f14ec4b717bc20f26535cbf4ec7aea38ac224bcc652d2f5766505e87" exitCode=0 Oct 04 02:43:57 crc kubenswrapper[4964]: I1004 02:43:57.197326 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k555" event={"ID":"29d99f96-8f30-452b-9ca2-1f0c640380f8","Type":"ContainerDied","Data":"f94770f5f14ec4b717bc20f26535cbf4ec7aea38ac224bcc652d2f5766505e87"} Oct 04 02:43:57 crc kubenswrapper[4964]: I1004 02:43:57.202451 4964 generic.go:334] "Generic (PLEG): container finished" podID="c828d956-23f7-4720-9c34-63d6a33833b3" containerID="643178d8f6aaf4ae3271dc57fba6e1c91f20701537ddf729cd1c20ca79ee1b53" exitCode=0 Oct 04 02:43:57 crc kubenswrapper[4964]: I1004 02:43:57.202567 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmr8p" 
event={"ID":"c828d956-23f7-4720-9c34-63d6a33833b3","Type":"ContainerDied","Data":"643178d8f6aaf4ae3271dc57fba6e1c91f20701537ddf729cd1c20ca79ee1b53"} Oct 04 02:43:57 crc kubenswrapper[4964]: I1004 02:43:57.210235 4964 generic.go:334] "Generic (PLEG): container finished" podID="698d3183-93e5-4693-8b24-cc507a41d274" containerID="9ea4d624e17d3389fa92665d77e191691dd977afc9b589c962df3c757048e29f" exitCode=0 Oct 04 02:43:57 crc kubenswrapper[4964]: I1004 02:43:57.210281 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gg8f" event={"ID":"698d3183-93e5-4693-8b24-cc507a41d274","Type":"ContainerDied","Data":"9ea4d624e17d3389fa92665d77e191691dd977afc9b589c962df3c757048e29f"} Oct 04 02:43:58 crc kubenswrapper[4964]: I1004 02:43:58.217446 4964 generic.go:334] "Generic (PLEG): container finished" podID="a6858848-2cfa-4910-9078-6d94d3e875d5" containerID="b0cebf1f0dbb5451596bb37a177be43ffaf12990c4b0c9bdf113a5e18dc8964c" exitCode=0 Oct 04 02:43:58 crc kubenswrapper[4964]: I1004 02:43:58.217528 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hw9j9" event={"ID":"a6858848-2cfa-4910-9078-6d94d3e875d5","Type":"ContainerDied","Data":"b0cebf1f0dbb5451596bb37a177be43ffaf12990c4b0c9bdf113a5e18dc8964c"} Oct 04 02:43:58 crc kubenswrapper[4964]: I1004 02:43:58.220539 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmr8p" event={"ID":"c828d956-23f7-4720-9c34-63d6a33833b3","Type":"ContainerStarted","Data":"d95909076f66835def982aae8d9bd42d3e72e546e7e50ea1aea476efdee8123a"} Oct 04 02:43:58 crc kubenswrapper[4964]: I1004 02:43:58.235247 4964 generic.go:334] "Generic (PLEG): container finished" podID="52d02de6-aa81-4246-bcea-838ed9fe84ed" containerID="3708e31dbcf25777399b6898bafb19a06b51f4c543676a3d0ebe07b5e1059e7f" exitCode=0 Oct 04 02:43:58 crc kubenswrapper[4964]: I1004 02:43:58.235388 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-c8548" event={"ID":"52d02de6-aa81-4246-bcea-838ed9fe84ed","Type":"ContainerDied","Data":"3708e31dbcf25777399b6898bafb19a06b51f4c543676a3d0ebe07b5e1059e7f"} Oct 04 02:43:58 crc kubenswrapper[4964]: I1004 02:43:58.239904 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gg8f" event={"ID":"698d3183-93e5-4693-8b24-cc507a41d274","Type":"ContainerStarted","Data":"ef967c97f130e9c95a38d6efdf6eec57c135b9cabc764d5789bf0dac0e1f8396"} Oct 04 02:43:58 crc kubenswrapper[4964]: I1004 02:43:58.251006 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kljsx" event={"ID":"e1213b9a-e51e-4af9-835b-a39b5378ed60","Type":"ContainerStarted","Data":"647ec4274350f2ea692b8d7012dbfdc03d1ba1cbd06d2710a9af79c234fde753"} Oct 04 02:43:58 crc kubenswrapper[4964]: I1004 02:43:58.252924 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xzv28" event={"ID":"b39c2152-b733-43f5-acd7-75e8948518f0","Type":"ContainerStarted","Data":"32bd3399c52b882c8bc07e007b111793c75b80a3a5fdbe7506ba521ffe353f74"} Oct 04 02:43:58 crc kubenswrapper[4964]: I1004 02:43:58.254543 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k555" event={"ID":"29d99f96-8f30-452b-9ca2-1f0c640380f8","Type":"ContainerStarted","Data":"a0788187527fa0dd6a45cac8a147a64a96e78a01da4da3c6fb75982193f678c0"} Oct 04 02:43:58 crc kubenswrapper[4964]: I1004 02:43:58.261448 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9gg8f" podStartSLOduration=2.028315775 podStartE2EDuration="1m9.26143414s" podCreationTimestamp="2025-10-04 02:42:49 +0000 UTC" firstStartedPulling="2025-10-04 02:42:50.54333949 +0000 UTC m=+150.440298128" lastFinishedPulling="2025-10-04 02:43:57.776457865 +0000 UTC m=+217.673416493" observedRunningTime="2025-10-04 
02:43:58.260895036 +0000 UTC m=+218.157853674" watchObservedRunningTime="2025-10-04 02:43:58.26143414 +0000 UTC m=+218.158392778" Oct 04 02:43:58 crc kubenswrapper[4964]: I1004 02:43:58.282320 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pmr8p" podStartSLOduration=2.133815318 podStartE2EDuration="1m7.282287846s" podCreationTimestamp="2025-10-04 02:42:51 +0000 UTC" firstStartedPulling="2025-10-04 02:42:52.653003374 +0000 UTC m=+152.549962012" lastFinishedPulling="2025-10-04 02:43:57.801475902 +0000 UTC m=+217.698434540" observedRunningTime="2025-10-04 02:43:58.278535497 +0000 UTC m=+218.175494135" watchObservedRunningTime="2025-10-04 02:43:58.282287846 +0000 UTC m=+218.179246474" Oct 04 02:43:58 crc kubenswrapper[4964]: I1004 02:43:58.320354 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kljsx" podStartSLOduration=2.070742392 podStartE2EDuration="1m7.32034013s" podCreationTimestamp="2025-10-04 02:42:51 +0000 UTC" firstStartedPulling="2025-10-04 02:42:52.666356975 +0000 UTC m=+152.563315613" lastFinishedPulling="2025-10-04 02:43:57.915954713 +0000 UTC m=+217.812913351" observedRunningTime="2025-10-04 02:43:58.316991121 +0000 UTC m=+218.213949769" watchObservedRunningTime="2025-10-04 02:43:58.32034013 +0000 UTC m=+218.217298768" Oct 04 02:43:58 crc kubenswrapper[4964]: I1004 02:43:58.341142 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xzv28" podStartSLOduration=1.929202788 podStartE2EDuration="1m9.341125964s" podCreationTimestamp="2025-10-04 02:42:49 +0000 UTC" firstStartedPulling="2025-10-04 02:42:50.545826971 +0000 UTC m=+150.442785609" lastFinishedPulling="2025-10-04 02:43:57.957750147 +0000 UTC m=+217.854708785" observedRunningTime="2025-10-04 02:43:58.337317673 +0000 UTC m=+218.234276311" watchObservedRunningTime="2025-10-04 02:43:58.341125964 
+0000 UTC m=+218.238084602" Oct 04 02:43:58 crc kubenswrapper[4964]: I1004 02:43:58.361628 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8k555" podStartSLOduration=3.01641425 podStartE2EDuration="1m10.36160101s" podCreationTimestamp="2025-10-04 02:42:48 +0000 UTC" firstStartedPulling="2025-10-04 02:42:50.531384164 +0000 UTC m=+150.428342802" lastFinishedPulling="2025-10-04 02:43:57.876570924 +0000 UTC m=+217.773529562" observedRunningTime="2025-10-04 02:43:58.358580609 +0000 UTC m=+218.255539247" watchObservedRunningTime="2025-10-04 02:43:58.36160101 +0000 UTC m=+218.258559648" Oct 04 02:43:59 crc kubenswrapper[4964]: I1004 02:43:59.259725 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hw9j9" event={"ID":"a6858848-2cfa-4910-9078-6d94d3e875d5","Type":"ContainerStarted","Data":"d3961546d0e91dc48373cb07159d5858d0ae26bcaafdd4a4af2fef5dc6e032bd"} Oct 04 02:43:59 crc kubenswrapper[4964]: I1004 02:43:59.260802 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:43:59 crc kubenswrapper[4964]: I1004 02:43:59.260952 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:43:59 crc kubenswrapper[4964]: I1004 02:43:59.263395 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c8548" event={"ID":"52d02de6-aa81-4246-bcea-838ed9fe84ed","Type":"ContainerStarted","Data":"6f4a9e266f8bc966c5c771326fdbf66ae0ff2dab0ded39872715fd26f2e9fa90"} Oct 04 02:43:59 crc kubenswrapper[4964]: I1004 02:43:59.280708 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hw9j9" podStartSLOduration=2.142115264 podStartE2EDuration="1m10.280691383s" podCreationTimestamp="2025-10-04 02:42:49 +0000 UTC" 
firstStartedPulling="2025-10-04 02:42:50.537476671 +0000 UTC m=+150.434435309" lastFinishedPulling="2025-10-04 02:43:58.67605278 +0000 UTC m=+218.573011428" observedRunningTime="2025-10-04 02:43:59.278341551 +0000 UTC m=+219.175300199" watchObservedRunningTime="2025-10-04 02:43:59.280691383 +0000 UTC m=+219.177650011" Oct 04 02:43:59 crc kubenswrapper[4964]: I1004 02:43:59.297923 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c8548" podStartSLOduration=2.373083335 podStartE2EDuration="1m7.297907902s" podCreationTimestamp="2025-10-04 02:42:52 +0000 UTC" firstStartedPulling="2025-10-04 02:42:53.700998255 +0000 UTC m=+153.597956893" lastFinishedPulling="2025-10-04 02:43:58.625822812 +0000 UTC m=+218.522781460" observedRunningTime="2025-10-04 02:43:59.297119391 +0000 UTC m=+219.194078029" watchObservedRunningTime="2025-10-04 02:43:59.297907902 +0000 UTC m=+219.194866530" Oct 04 02:43:59 crc kubenswrapper[4964]: I1004 02:43:59.451260 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:43:59 crc kubenswrapper[4964]: I1004 02:43:59.452098 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:43:59 crc kubenswrapper[4964]: I1004 02:43:59.711541 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:43:59 crc kubenswrapper[4964]: I1004 02:43:59.711607 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:43:59 crc kubenswrapper[4964]: I1004 02:43:59.746444 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:43:59 crc kubenswrapper[4964]: I1004 02:43:59.882305 4964 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:43:59 crc kubenswrapper[4964]: I1004 02:43:59.882362 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:44:00 crc kubenswrapper[4964]: I1004 02:44:00.031046 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2zdrf"] Oct 04 02:44:00 crc kubenswrapper[4964]: I1004 02:44:00.300996 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-8k555" podUID="29d99f96-8f30-452b-9ca2-1f0c640380f8" containerName="registry-server" probeResult="failure" output=< Oct 04 02:44:00 crc kubenswrapper[4964]: timeout: failed to connect service ":50051" within 1s Oct 04 02:44:00 crc kubenswrapper[4964]: > Oct 04 02:44:00 crc kubenswrapper[4964]: I1004 02:44:00.494742 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9gg8f" podUID="698d3183-93e5-4693-8b24-cc507a41d274" containerName="registry-server" probeResult="failure" output=< Oct 04 02:44:00 crc kubenswrapper[4964]: timeout: failed to connect service ":50051" within 1s Oct 04 02:44:00 crc kubenswrapper[4964]: > Oct 04 02:44:00 crc kubenswrapper[4964]: I1004 02:44:00.915474 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hw9j9" podUID="a6858848-2cfa-4910-9078-6d94d3e875d5" containerName="registry-server" probeResult="failure" output=< Oct 04 02:44:00 crc kubenswrapper[4964]: timeout: failed to connect service ":50051" within 1s Oct 04 02:44:00 crc kubenswrapper[4964]: > Oct 04 02:44:01 crc kubenswrapper[4964]: I1004 02:44:01.435987 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:44:01 crc kubenswrapper[4964]: I1004 02:44:01.436268 4964 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:44:01 crc kubenswrapper[4964]: I1004 02:44:01.480984 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:44:01 crc kubenswrapper[4964]: I1004 02:44:01.837056 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:44:01 crc kubenswrapper[4964]: I1004 02:44:01.837096 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:44:01 crc kubenswrapper[4964]: I1004 02:44:01.896063 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:44:02 crc kubenswrapper[4964]: I1004 02:44:02.339696 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:44:02 crc kubenswrapper[4964]: I1004 02:44:02.861171 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:44:02 crc kubenswrapper[4964]: I1004 02:44:02.861212 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:44:03 crc kubenswrapper[4964]: I1004 02:44:03.631184 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmr8p"] Oct 04 02:44:03 crc kubenswrapper[4964]: I1004 02:44:03.906661 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c8548" podUID="52d02de6-aa81-4246-bcea-838ed9fe84ed" containerName="registry-server" probeResult="failure" output=< Oct 04 02:44:03 crc kubenswrapper[4964]: timeout: failed to connect service ":50051" within 1s Oct 04 02:44:03 crc 
kubenswrapper[4964]: > Oct 04 02:44:04 crc kubenswrapper[4964]: I1004 02:44:04.449133 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:44:04 crc kubenswrapper[4964]: I1004 02:44:04.449212 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:44:04 crc kubenswrapper[4964]: I1004 02:44:04.449276 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:44:04 crc kubenswrapper[4964]: I1004 02:44:04.450122 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 02:44:04 crc kubenswrapper[4964]: I1004 02:44:04.450332 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387" gracePeriod=600 Oct 04 02:44:05 crc kubenswrapper[4964]: I1004 02:44:05.327651 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" 
event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387"} Oct 04 02:44:05 crc kubenswrapper[4964]: I1004 02:44:05.329088 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387" exitCode=0 Oct 04 02:44:05 crc kubenswrapper[4964]: I1004 02:44:05.329266 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"ca513d76044d84d58871dc80cf9d1dc2e3bdff83478b7916d3faa2f268b14909"} Oct 04 02:44:05 crc kubenswrapper[4964]: I1004 02:44:05.329547 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pmr8p" podUID="c828d956-23f7-4720-9c34-63d6a33833b3" containerName="registry-server" containerID="cri-o://d95909076f66835def982aae8d9bd42d3e72e546e7e50ea1aea476efdee8123a" gracePeriod=2 Oct 04 02:44:05 crc kubenswrapper[4964]: I1004 02:44:05.674566 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:44:05 crc kubenswrapper[4964]: I1004 02:44:05.702592 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c828d956-23f7-4720-9c34-63d6a33833b3-utilities\") pod \"c828d956-23f7-4720-9c34-63d6a33833b3\" (UID: \"c828d956-23f7-4720-9c34-63d6a33833b3\") " Oct 04 02:44:05 crc kubenswrapper[4964]: I1004 02:44:05.702647 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c828d956-23f7-4720-9c34-63d6a33833b3-catalog-content\") pod \"c828d956-23f7-4720-9c34-63d6a33833b3\" (UID: \"c828d956-23f7-4720-9c34-63d6a33833b3\") " Oct 04 02:44:05 crc kubenswrapper[4964]: I1004 02:44:05.702750 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww94v\" (UniqueName: \"kubernetes.io/projected/c828d956-23f7-4720-9c34-63d6a33833b3-kube-api-access-ww94v\") pod \"c828d956-23f7-4720-9c34-63d6a33833b3\" (UID: \"c828d956-23f7-4720-9c34-63d6a33833b3\") " Oct 04 02:44:05 crc kubenswrapper[4964]: I1004 02:44:05.707981 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c828d956-23f7-4720-9c34-63d6a33833b3-utilities" (OuterVolumeSpecName: "utilities") pod "c828d956-23f7-4720-9c34-63d6a33833b3" (UID: "c828d956-23f7-4720-9c34-63d6a33833b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:05 crc kubenswrapper[4964]: I1004 02:44:05.709414 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c828d956-23f7-4720-9c34-63d6a33833b3-kube-api-access-ww94v" (OuterVolumeSpecName: "kube-api-access-ww94v") pod "c828d956-23f7-4720-9c34-63d6a33833b3" (UID: "c828d956-23f7-4720-9c34-63d6a33833b3"). InnerVolumeSpecName "kube-api-access-ww94v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:44:05 crc kubenswrapper[4964]: I1004 02:44:05.717220 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c828d956-23f7-4720-9c34-63d6a33833b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c828d956-23f7-4720-9c34-63d6a33833b3" (UID: "c828d956-23f7-4720-9c34-63d6a33833b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:05 crc kubenswrapper[4964]: I1004 02:44:05.804211 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww94v\" (UniqueName: \"kubernetes.io/projected/c828d956-23f7-4720-9c34-63d6a33833b3-kube-api-access-ww94v\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:05 crc kubenswrapper[4964]: I1004 02:44:05.804242 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c828d956-23f7-4720-9c34-63d6a33833b3-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:05 crc kubenswrapper[4964]: I1004 02:44:05.804254 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c828d956-23f7-4720-9c34-63d6a33833b3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.335921 4964 generic.go:334] "Generic (PLEG): container finished" podID="c828d956-23f7-4720-9c34-63d6a33833b3" containerID="d95909076f66835def982aae8d9bd42d3e72e546e7e50ea1aea476efdee8123a" exitCode=0 Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.336136 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmr8p" event={"ID":"c828d956-23f7-4720-9c34-63d6a33833b3","Type":"ContainerDied","Data":"d95909076f66835def982aae8d9bd42d3e72e546e7e50ea1aea476efdee8123a"} Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.336175 4964 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-pmr8p" event={"ID":"c828d956-23f7-4720-9c34-63d6a33833b3","Type":"ContainerDied","Data":"73546740e00b97d4d7c13c812da3860ee399f9bd5335708ff2f2760ebfba6bd4"} Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.336196 4964 scope.go:117] "RemoveContainer" containerID="d95909076f66835def982aae8d9bd42d3e72e546e7e50ea1aea476efdee8123a" Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.336340 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmr8p" Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.356666 4964 scope.go:117] "RemoveContainer" containerID="643178d8f6aaf4ae3271dc57fba6e1c91f20701537ddf729cd1c20ca79ee1b53" Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.361140 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmr8p"] Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.364578 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmr8p"] Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.381309 4964 scope.go:117] "RemoveContainer" containerID="a9f1b0b776cda5eb4f128f6c297590b870730dd35af00d281015b1539f841f46" Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.398771 4964 scope.go:117] "RemoveContainer" containerID="d95909076f66835def982aae8d9bd42d3e72e546e7e50ea1aea476efdee8123a" Oct 04 02:44:06 crc kubenswrapper[4964]: E1004 02:44:06.399166 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d95909076f66835def982aae8d9bd42d3e72e546e7e50ea1aea476efdee8123a\": container with ID starting with d95909076f66835def982aae8d9bd42d3e72e546e7e50ea1aea476efdee8123a not found: ID does not exist" containerID="d95909076f66835def982aae8d9bd42d3e72e546e7e50ea1aea476efdee8123a" Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.399222 4964 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d95909076f66835def982aae8d9bd42d3e72e546e7e50ea1aea476efdee8123a"} err="failed to get container status \"d95909076f66835def982aae8d9bd42d3e72e546e7e50ea1aea476efdee8123a\": rpc error: code = NotFound desc = could not find container \"d95909076f66835def982aae8d9bd42d3e72e546e7e50ea1aea476efdee8123a\": container with ID starting with d95909076f66835def982aae8d9bd42d3e72e546e7e50ea1aea476efdee8123a not found: ID does not exist" Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.399254 4964 scope.go:117] "RemoveContainer" containerID="643178d8f6aaf4ae3271dc57fba6e1c91f20701537ddf729cd1c20ca79ee1b53" Oct 04 02:44:06 crc kubenswrapper[4964]: E1004 02:44:06.399724 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"643178d8f6aaf4ae3271dc57fba6e1c91f20701537ddf729cd1c20ca79ee1b53\": container with ID starting with 643178d8f6aaf4ae3271dc57fba6e1c91f20701537ddf729cd1c20ca79ee1b53 not found: ID does not exist" containerID="643178d8f6aaf4ae3271dc57fba6e1c91f20701537ddf729cd1c20ca79ee1b53" Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.399899 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"643178d8f6aaf4ae3271dc57fba6e1c91f20701537ddf729cd1c20ca79ee1b53"} err="failed to get container status \"643178d8f6aaf4ae3271dc57fba6e1c91f20701537ddf729cd1c20ca79ee1b53\": rpc error: code = NotFound desc = could not find container \"643178d8f6aaf4ae3271dc57fba6e1c91f20701537ddf729cd1c20ca79ee1b53\": container with ID starting with 643178d8f6aaf4ae3271dc57fba6e1c91f20701537ddf729cd1c20ca79ee1b53 not found: ID does not exist" Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.399920 4964 scope.go:117] "RemoveContainer" containerID="a9f1b0b776cda5eb4f128f6c297590b870730dd35af00d281015b1539f841f46" Oct 04 02:44:06 crc kubenswrapper[4964]: E1004 
02:44:06.400187 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9f1b0b776cda5eb4f128f6c297590b870730dd35af00d281015b1539f841f46\": container with ID starting with a9f1b0b776cda5eb4f128f6c297590b870730dd35af00d281015b1539f841f46 not found: ID does not exist" containerID="a9f1b0b776cda5eb4f128f6c297590b870730dd35af00d281015b1539f841f46" Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.400204 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f1b0b776cda5eb4f128f6c297590b870730dd35af00d281015b1539f841f46"} err="failed to get container status \"a9f1b0b776cda5eb4f128f6c297590b870730dd35af00d281015b1539f841f46\": rpc error: code = NotFound desc = could not find container \"a9f1b0b776cda5eb4f128f6c297590b870730dd35af00d281015b1539f841f46\": container with ID starting with a9f1b0b776cda5eb4f128f6c297590b870730dd35af00d281015b1539f841f46 not found: ID does not exist" Oct 04 02:44:06 crc kubenswrapper[4964]: I1004 02:44:06.853388 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c828d956-23f7-4720-9c34-63d6a33833b3" path="/var/lib/kubelet/pods/c828d956-23f7-4720-9c34-63d6a33833b3/volumes" Oct 04 02:44:09 crc kubenswrapper[4964]: I1004 02:44:09.340719 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:44:09 crc kubenswrapper[4964]: I1004 02:44:09.406554 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:44:09 crc kubenswrapper[4964]: I1004 02:44:09.494898 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:44:09 crc kubenswrapper[4964]: I1004 02:44:09.532728 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:44:09 crc kubenswrapper[4964]: I1004 02:44:09.758375 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:44:09 crc kubenswrapper[4964]: I1004 02:44:09.940925 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:44:09 crc kubenswrapper[4964]: I1004 02:44:09.977778 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:44:11 crc kubenswrapper[4964]: I1004 02:44:11.481089 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:44:12 crc kubenswrapper[4964]: I1004 02:44:12.626983 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xzv28"] Oct 04 02:44:12 crc kubenswrapper[4964]: I1004 02:44:12.627365 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xzv28" podUID="b39c2152-b733-43f5-acd7-75e8948518f0" containerName="registry-server" containerID="cri-o://32bd3399c52b882c8bc07e007b111793c75b80a3a5fdbe7506ba521ffe353f74" gracePeriod=2 Oct 04 02:44:12 crc kubenswrapper[4964]: I1004 02:44:12.855217 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hw9j9"] Oct 04 02:44:12 crc kubenswrapper[4964]: I1004 02:44:12.855491 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hw9j9" podUID="a6858848-2cfa-4910-9078-6d94d3e875d5" containerName="registry-server" containerID="cri-o://d3961546d0e91dc48373cb07159d5858d0ae26bcaafdd4a4af2fef5dc6e032bd" gracePeriod=2 Oct 04 02:44:12 crc kubenswrapper[4964]: I1004 02:44:12.911477 4964 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:44:12 crc kubenswrapper[4964]: I1004 02:44:12.953949 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:44:13 crc kubenswrapper[4964]: E1004 02:44:13.148333 4964 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6858848_2cfa_4910_9078_6d94d3e875d5.slice/crio-conmon-d3961546d0e91dc48373cb07159d5858d0ae26bcaafdd4a4af2fef5dc6e032bd.scope\": RecentStats: unable to find data in memory cache]" Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.388859 4964 generic.go:334] "Generic (PLEG): container finished" podID="b39c2152-b733-43f5-acd7-75e8948518f0" containerID="32bd3399c52b882c8bc07e007b111793c75b80a3a5fdbe7506ba521ffe353f74" exitCode=0 Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.388942 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xzv28" event={"ID":"b39c2152-b733-43f5-acd7-75e8948518f0","Type":"ContainerDied","Data":"32bd3399c52b882c8bc07e007b111793c75b80a3a5fdbe7506ba521ffe353f74"} Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.391686 4964 generic.go:334] "Generic (PLEG): container finished" podID="a6858848-2cfa-4910-9078-6d94d3e875d5" containerID="d3961546d0e91dc48373cb07159d5858d0ae26bcaafdd4a4af2fef5dc6e032bd" exitCode=0 Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.391766 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hw9j9" event={"ID":"a6858848-2cfa-4910-9078-6d94d3e875d5","Type":"ContainerDied","Data":"d3961546d0e91dc48373cb07159d5858d0ae26bcaafdd4a4af2fef5dc6e032bd"} Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.689182 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.767248 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.805161 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7cxd\" (UniqueName: \"kubernetes.io/projected/a6858848-2cfa-4910-9078-6d94d3e875d5-kube-api-access-v7cxd\") pod \"a6858848-2cfa-4910-9078-6d94d3e875d5\" (UID: \"a6858848-2cfa-4910-9078-6d94d3e875d5\") " Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.805233 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39c2152-b733-43f5-acd7-75e8948518f0-catalog-content\") pod \"b39c2152-b733-43f5-acd7-75e8948518f0\" (UID: \"b39c2152-b733-43f5-acd7-75e8948518f0\") " Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.805253 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4rvc\" (UniqueName: \"kubernetes.io/projected/b39c2152-b733-43f5-acd7-75e8948518f0-kube-api-access-j4rvc\") pod \"b39c2152-b733-43f5-acd7-75e8948518f0\" (UID: \"b39c2152-b733-43f5-acd7-75e8948518f0\") " Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.805301 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39c2152-b733-43f5-acd7-75e8948518f0-utilities\") pod \"b39c2152-b733-43f5-acd7-75e8948518f0\" (UID: \"b39c2152-b733-43f5-acd7-75e8948518f0\") " Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.805344 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6858848-2cfa-4910-9078-6d94d3e875d5-utilities\") pod 
\"a6858848-2cfa-4910-9078-6d94d3e875d5\" (UID: \"a6858848-2cfa-4910-9078-6d94d3e875d5\") " Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.805362 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6858848-2cfa-4910-9078-6d94d3e875d5-catalog-content\") pod \"a6858848-2cfa-4910-9078-6d94d3e875d5\" (UID: \"a6858848-2cfa-4910-9078-6d94d3e875d5\") " Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.806413 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6858848-2cfa-4910-9078-6d94d3e875d5-utilities" (OuterVolumeSpecName: "utilities") pod "a6858848-2cfa-4910-9078-6d94d3e875d5" (UID: "a6858848-2cfa-4910-9078-6d94d3e875d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.806659 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6858848-2cfa-4910-9078-6d94d3e875d5-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.809697 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39c2152-b733-43f5-acd7-75e8948518f0-utilities" (OuterVolumeSpecName: "utilities") pod "b39c2152-b733-43f5-acd7-75e8948518f0" (UID: "b39c2152-b733-43f5-acd7-75e8948518f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.810066 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6858848-2cfa-4910-9078-6d94d3e875d5-kube-api-access-v7cxd" (OuterVolumeSpecName: "kube-api-access-v7cxd") pod "a6858848-2cfa-4910-9078-6d94d3e875d5" (UID: "a6858848-2cfa-4910-9078-6d94d3e875d5"). InnerVolumeSpecName "kube-api-access-v7cxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.810585 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b39c2152-b733-43f5-acd7-75e8948518f0-kube-api-access-j4rvc" (OuterVolumeSpecName: "kube-api-access-j4rvc") pod "b39c2152-b733-43f5-acd7-75e8948518f0" (UID: "b39c2152-b733-43f5-acd7-75e8948518f0"). InnerVolumeSpecName "kube-api-access-j4rvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.852077 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6858848-2cfa-4910-9078-6d94d3e875d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6858848-2cfa-4910-9078-6d94d3e875d5" (UID: "a6858848-2cfa-4910-9078-6d94d3e875d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.865314 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39c2152-b733-43f5-acd7-75e8948518f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b39c2152-b733-43f5-acd7-75e8948518f0" (UID: "b39c2152-b733-43f5-acd7-75e8948518f0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.907781 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39c2152-b733-43f5-acd7-75e8948518f0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.907816 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4rvc\" (UniqueName: \"kubernetes.io/projected/b39c2152-b733-43f5-acd7-75e8948518f0-kube-api-access-j4rvc\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.907829 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39c2152-b733-43f5-acd7-75e8948518f0-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.907837 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6858848-2cfa-4910-9078-6d94d3e875d5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:13 crc kubenswrapper[4964]: I1004 02:44:13.907848 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7cxd\" (UniqueName: \"kubernetes.io/projected/a6858848-2cfa-4910-9078-6d94d3e875d5-kube-api-access-v7cxd\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.401845 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xzv28" event={"ID":"b39c2152-b733-43f5-acd7-75e8948518f0","Type":"ContainerDied","Data":"42b2453f3a76abcd98888ca9166737f1e9d3d3f46a81e848491dd94078f83913"} Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.402172 4964 scope.go:117] "RemoveContainer" containerID="32bd3399c52b882c8bc07e007b111793c75b80a3a5fdbe7506ba521ffe353f74" Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.401923 4964 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xzv28" Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.407362 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hw9j9" event={"ID":"a6858848-2cfa-4910-9078-6d94d3e875d5","Type":"ContainerDied","Data":"a02bd683587accb1194e8f06b4fefee1d23e70400c0b499540a906d13f33a3ed"} Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.407427 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hw9j9" Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.424506 4964 scope.go:117] "RemoveContainer" containerID="458e5627b3ac90dc60863823a3442f587e1e5e54880fe3ec03f7dc7ab1cae5db" Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.453216 4964 scope.go:117] "RemoveContainer" containerID="1af8fb64c4638f8f2592d7cf60a47f7d8510fb2688aa381cb09f29046d575cf4" Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.455004 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xzv28"] Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.459163 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xzv28"] Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.489982 4964 scope.go:117] "RemoveContainer" containerID="d3961546d0e91dc48373cb07159d5858d0ae26bcaafdd4a4af2fef5dc6e032bd" Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.494657 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hw9j9"] Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.495924 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hw9j9"] Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.510028 4964 scope.go:117] "RemoveContainer" 
containerID="b0cebf1f0dbb5451596bb37a177be43ffaf12990c4b0c9bdf113a5e18dc8964c" Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.527123 4964 scope.go:117] "RemoveContainer" containerID="d96b71f6221b168a08fb080ef4247feb2c16346a966ef3e7a8e1a9eda03ca064" Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.855388 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6858848-2cfa-4910-9078-6d94d3e875d5" path="/var/lib/kubelet/pods/a6858848-2cfa-4910-9078-6d94d3e875d5/volumes" Oct 04 02:44:14 crc kubenswrapper[4964]: I1004 02:44:14.856664 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b39c2152-b733-43f5-acd7-75e8948518f0" path="/var/lib/kubelet/pods/b39c2152-b733-43f5-acd7-75e8948518f0/volumes" Oct 04 02:44:17 crc kubenswrapper[4964]: I1004 02:44:17.231873 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c8548"] Oct 04 02:44:17 crc kubenswrapper[4964]: I1004 02:44:17.232342 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c8548" podUID="52d02de6-aa81-4246-bcea-838ed9fe84ed" containerName="registry-server" containerID="cri-o://6f4a9e266f8bc966c5c771326fdbf66ae0ff2dab0ded39872715fd26f2e9fa90" gracePeriod=2 Oct 04 02:44:17 crc kubenswrapper[4964]: I1004 02:44:17.430584 4964 generic.go:334] "Generic (PLEG): container finished" podID="52d02de6-aa81-4246-bcea-838ed9fe84ed" containerID="6f4a9e266f8bc966c5c771326fdbf66ae0ff2dab0ded39872715fd26f2e9fa90" exitCode=0 Oct 04 02:44:17 crc kubenswrapper[4964]: I1004 02:44:17.430858 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c8548" event={"ID":"52d02de6-aa81-4246-bcea-838ed9fe84ed","Type":"ContainerDied","Data":"6f4a9e266f8bc966c5c771326fdbf66ae0ff2dab0ded39872715fd26f2e9fa90"} Oct 04 02:44:17 crc kubenswrapper[4964]: I1004 02:44:17.609749 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:44:17 crc kubenswrapper[4964]: I1004 02:44:17.651369 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d02de6-aa81-4246-bcea-838ed9fe84ed-catalog-content\") pod \"52d02de6-aa81-4246-bcea-838ed9fe84ed\" (UID: \"52d02de6-aa81-4246-bcea-838ed9fe84ed\") " Oct 04 02:44:17 crc kubenswrapper[4964]: I1004 02:44:17.651538 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d02de6-aa81-4246-bcea-838ed9fe84ed-utilities\") pod \"52d02de6-aa81-4246-bcea-838ed9fe84ed\" (UID: \"52d02de6-aa81-4246-bcea-838ed9fe84ed\") " Oct 04 02:44:17 crc kubenswrapper[4964]: I1004 02:44:17.651676 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfxwg\" (UniqueName: \"kubernetes.io/projected/52d02de6-aa81-4246-bcea-838ed9fe84ed-kube-api-access-bfxwg\") pod \"52d02de6-aa81-4246-bcea-838ed9fe84ed\" (UID: \"52d02de6-aa81-4246-bcea-838ed9fe84ed\") " Oct 04 02:44:17 crc kubenswrapper[4964]: I1004 02:44:17.653958 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52d02de6-aa81-4246-bcea-838ed9fe84ed-utilities" (OuterVolumeSpecName: "utilities") pod "52d02de6-aa81-4246-bcea-838ed9fe84ed" (UID: "52d02de6-aa81-4246-bcea-838ed9fe84ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:17 crc kubenswrapper[4964]: I1004 02:44:17.656636 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52d02de6-aa81-4246-bcea-838ed9fe84ed-kube-api-access-bfxwg" (OuterVolumeSpecName: "kube-api-access-bfxwg") pod "52d02de6-aa81-4246-bcea-838ed9fe84ed" (UID: "52d02de6-aa81-4246-bcea-838ed9fe84ed"). InnerVolumeSpecName "kube-api-access-bfxwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:44:17 crc kubenswrapper[4964]: I1004 02:44:17.746334 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52d02de6-aa81-4246-bcea-838ed9fe84ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52d02de6-aa81-4246-bcea-838ed9fe84ed" (UID: "52d02de6-aa81-4246-bcea-838ed9fe84ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:17 crc kubenswrapper[4964]: I1004 02:44:17.753218 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d02de6-aa81-4246-bcea-838ed9fe84ed-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:17 crc kubenswrapper[4964]: I1004 02:44:17.753246 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfxwg\" (UniqueName: \"kubernetes.io/projected/52d02de6-aa81-4246-bcea-838ed9fe84ed-kube-api-access-bfxwg\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:17 crc kubenswrapper[4964]: I1004 02:44:17.753257 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d02de6-aa81-4246-bcea-838ed9fe84ed-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:18 crc kubenswrapper[4964]: I1004 02:44:18.439446 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c8548" event={"ID":"52d02de6-aa81-4246-bcea-838ed9fe84ed","Type":"ContainerDied","Data":"6d857278b3b3c9b4dcf60dff125012e6b10a2897cec612d823c29fe290a27dbf"} Oct 04 02:44:18 crc kubenswrapper[4964]: I1004 02:44:18.439514 4964 scope.go:117] "RemoveContainer" containerID="6f4a9e266f8bc966c5c771326fdbf66ae0ff2dab0ded39872715fd26f2e9fa90" Oct 04 02:44:18 crc kubenswrapper[4964]: I1004 02:44:18.440516 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c8548" Oct 04 02:44:18 crc kubenswrapper[4964]: I1004 02:44:18.466968 4964 scope.go:117] "RemoveContainer" containerID="3708e31dbcf25777399b6898bafb19a06b51f4c543676a3d0ebe07b5e1059e7f" Oct 04 02:44:18 crc kubenswrapper[4964]: I1004 02:44:18.488871 4964 scope.go:117] "RemoveContainer" containerID="9a24fc1e64d51fb059eaa13188c8e26dcba190efe946971ef0bc87cb606c9a0a" Oct 04 02:44:18 crc kubenswrapper[4964]: I1004 02:44:18.489821 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c8548"] Oct 04 02:44:18 crc kubenswrapper[4964]: I1004 02:44:18.491836 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c8548"] Oct 04 02:44:18 crc kubenswrapper[4964]: I1004 02:44:18.854847 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52d02de6-aa81-4246-bcea-838ed9fe84ed" path="/var/lib/kubelet/pods/52d02de6-aa81-4246-bcea-838ed9fe84ed/volumes" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.051762 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" podUID="b564efb8-50db-4d84-a456-4857001ab84a" containerName="oauth-openshift" containerID="cri-o://d1e9da36de58ece3e5865923a659594c8125c41213a77e28095eec472783a7db" gracePeriod=15 Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.485132 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.489777 4964 generic.go:334] "Generic (PLEG): container finished" podID="b564efb8-50db-4d84-a456-4857001ab84a" containerID="d1e9da36de58ece3e5865923a659594c8125c41213a77e28095eec472783a7db" exitCode=0 Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.489849 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" event={"ID":"b564efb8-50db-4d84-a456-4857001ab84a","Type":"ContainerDied","Data":"d1e9da36de58ece3e5865923a659594c8125c41213a77e28095eec472783a7db"} Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.489908 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" event={"ID":"b564efb8-50db-4d84-a456-4857001ab84a","Type":"ContainerDied","Data":"73bba6d1ee40bfa1ae61dfcdefd3905505ef0751c21fbff26be98b49f585eebb"} Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.489947 4964 scope.go:117] "RemoveContainer" containerID="d1e9da36de58ece3e5865923a659594c8125c41213a77e28095eec472783a7db" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.524519 4964 scope.go:117] "RemoveContainer" containerID="d1e9da36de58ece3e5865923a659594c8125c41213a77e28095eec472783a7db" Oct 04 02:44:25 crc kubenswrapper[4964]: E1004 02:44:25.525255 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1e9da36de58ece3e5865923a659594c8125c41213a77e28095eec472783a7db\": container with ID starting with d1e9da36de58ece3e5865923a659594c8125c41213a77e28095eec472783a7db not found: ID does not exist" containerID="d1e9da36de58ece3e5865923a659594c8125c41213a77e28095eec472783a7db" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.525300 4964 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d1e9da36de58ece3e5865923a659594c8125c41213a77e28095eec472783a7db"} err="failed to get container status \"d1e9da36de58ece3e5865923a659594c8125c41213a77e28095eec472783a7db\": rpc error: code = NotFound desc = could not find container \"d1e9da36de58ece3e5865923a659594c8125c41213a77e28095eec472783a7db\": container with ID starting with d1e9da36de58ece3e5865923a659594c8125c41213a77e28095eec472783a7db not found: ID does not exist" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.560176 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-idp-0-file-data\") pod \"b564efb8-50db-4d84-a456-4857001ab84a\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.560230 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-service-ca\") pod \"b564efb8-50db-4d84-a456-4857001ab84a\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.560291 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-session\") pod \"b564efb8-50db-4d84-a456-4857001ab84a\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.560313 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b564efb8-50db-4d84-a456-4857001ab84a-audit-dir\") pod \"b564efb8-50db-4d84-a456-4857001ab84a\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 
02:44:25.560343 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-error\") pod \"b564efb8-50db-4d84-a456-4857001ab84a\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.560398 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-ocp-branding-template\") pod \"b564efb8-50db-4d84-a456-4857001ab84a\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.560428 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-cliconfig\") pod \"b564efb8-50db-4d84-a456-4857001ab84a\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.560429 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b564efb8-50db-4d84-a456-4857001ab84a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b564efb8-50db-4d84-a456-4857001ab84a" (UID: "b564efb8-50db-4d84-a456-4857001ab84a"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.560460 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-router-certs\") pod \"b564efb8-50db-4d84-a456-4857001ab84a\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.560493 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-provider-selection\") pod \"b564efb8-50db-4d84-a456-4857001ab84a\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.560521 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-serving-cert\") pod \"b564efb8-50db-4d84-a456-4857001ab84a\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.560564 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-audit-policies\") pod \"b564efb8-50db-4d84-a456-4857001ab84a\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.560587 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-login\") pod \"b564efb8-50db-4d84-a456-4857001ab84a\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " Oct 04 02:44:25 crc 
kubenswrapper[4964]: I1004 02:44:25.560633 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-trusted-ca-bundle\") pod \"b564efb8-50db-4d84-a456-4857001ab84a\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.560669 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkfr9\" (UniqueName: \"kubernetes.io/projected/b564efb8-50db-4d84-a456-4857001ab84a-kube-api-access-vkfr9\") pod \"b564efb8-50db-4d84-a456-4857001ab84a\" (UID: \"b564efb8-50db-4d84-a456-4857001ab84a\") " Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.560884 4964 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b564efb8-50db-4d84-a456-4857001ab84a-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.560901 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "b564efb8-50db-4d84-a456-4857001ab84a" (UID: "b564efb8-50db-4d84-a456-4857001ab84a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.562464 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "b564efb8-50db-4d84-a456-4857001ab84a" (UID: "b564efb8-50db-4d84-a456-4857001ab84a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.562721 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "b564efb8-50db-4d84-a456-4857001ab84a" (UID: "b564efb8-50db-4d84-a456-4857001ab84a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.562829 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "b564efb8-50db-4d84-a456-4857001ab84a" (UID: "b564efb8-50db-4d84-a456-4857001ab84a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.565967 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "b564efb8-50db-4d84-a456-4857001ab84a" (UID: "b564efb8-50db-4d84-a456-4857001ab84a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.566121 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "b564efb8-50db-4d84-a456-4857001ab84a" (UID: "b564efb8-50db-4d84-a456-4857001ab84a"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.566565 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "b564efb8-50db-4d84-a456-4857001ab84a" (UID: "b564efb8-50db-4d84-a456-4857001ab84a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.566675 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "b564efb8-50db-4d84-a456-4857001ab84a" (UID: "b564efb8-50db-4d84-a456-4857001ab84a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.566968 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "b564efb8-50db-4d84-a456-4857001ab84a" (UID: "b564efb8-50db-4d84-a456-4857001ab84a"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.567723 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b564efb8-50db-4d84-a456-4857001ab84a-kube-api-access-vkfr9" (OuterVolumeSpecName: "kube-api-access-vkfr9") pod "b564efb8-50db-4d84-a456-4857001ab84a" (UID: "b564efb8-50db-4d84-a456-4857001ab84a"). 
InnerVolumeSpecName "kube-api-access-vkfr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.567943 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "b564efb8-50db-4d84-a456-4857001ab84a" (UID: "b564efb8-50db-4d84-a456-4857001ab84a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.571127 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "b564efb8-50db-4d84-a456-4857001ab84a" (UID: "b564efb8-50db-4d84-a456-4857001ab84a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.571342 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "b564efb8-50db-4d84-a456-4857001ab84a" (UID: "b564efb8-50db-4d84-a456-4857001ab84a"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.661813 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.661870 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.661897 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.661917 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.661936 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.661989 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.662009 4964 reconciler_common.go:293] "Volume detached 
for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.662028 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.662046 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.662101 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkfr9\" (UniqueName: \"kubernetes.io/projected/b564efb8-50db-4d84-a456-4857001ab84a-kube-api-access-vkfr9\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.662120 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.662138 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:25 crc kubenswrapper[4964]: I1004 02:44:25.662156 4964 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b564efb8-50db-4d84-a456-4857001ab84a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 
02:44:26.153182 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7b964c775c-6pr5h"] Oct 04 02:44:26 crc kubenswrapper[4964]: E1004 02:44:26.153523 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6858848-2cfa-4910-9078-6d94d3e875d5" containerName="registry-server" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.153552 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6858848-2cfa-4910-9078-6d94d3e875d5" containerName="registry-server" Oct 04 02:44:26 crc kubenswrapper[4964]: E1004 02:44:26.153574 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d02de6-aa81-4246-bcea-838ed9fe84ed" containerName="extract-content" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.153589 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d02de6-aa81-4246-bcea-838ed9fe84ed" containerName="extract-content" Oct 04 02:44:26 crc kubenswrapper[4964]: E1004 02:44:26.153607 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d02de6-aa81-4246-bcea-838ed9fe84ed" containerName="registry-server" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.153655 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d02de6-aa81-4246-bcea-838ed9fe84ed" containerName="registry-server" Oct 04 02:44:26 crc kubenswrapper[4964]: E1004 02:44:26.153679 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6858848-2cfa-4910-9078-6d94d3e875d5" containerName="extract-utilities" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.153694 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6858848-2cfa-4910-9078-6d94d3e875d5" containerName="extract-utilities" Oct 04 02:44:26 crc kubenswrapper[4964]: E1004 02:44:26.153724 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c828d956-23f7-4720-9c34-63d6a33833b3" containerName="extract-utilities" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.153739 4964 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c828d956-23f7-4720-9c34-63d6a33833b3" containerName="extract-utilities" Oct 04 02:44:26 crc kubenswrapper[4964]: E1004 02:44:26.153760 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b564efb8-50db-4d84-a456-4857001ab84a" containerName="oauth-openshift" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.153776 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="b564efb8-50db-4d84-a456-4857001ab84a" containerName="oauth-openshift" Oct 04 02:44:26 crc kubenswrapper[4964]: E1004 02:44:26.153807 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d02de6-aa81-4246-bcea-838ed9fe84ed" containerName="extract-utilities" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.153823 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d02de6-aa81-4246-bcea-838ed9fe84ed" containerName="extract-utilities" Oct 04 02:44:26 crc kubenswrapper[4964]: E1004 02:44:26.153845 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c828d956-23f7-4720-9c34-63d6a33833b3" containerName="registry-server" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.153863 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="c828d956-23f7-4720-9c34-63d6a33833b3" containerName="registry-server" Oct 04 02:44:26 crc kubenswrapper[4964]: E1004 02:44:26.153888 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b21a779b-5ac3-45ab-bb35-cd476548696b" containerName="pruner" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.153904 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="b21a779b-5ac3-45ab-bb35-cd476548696b" containerName="pruner" Oct 04 02:44:26 crc kubenswrapper[4964]: E1004 02:44:26.153926 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39c2152-b733-43f5-acd7-75e8948518f0" containerName="extract-content" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.153941 4964 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b39c2152-b733-43f5-acd7-75e8948518f0" containerName="extract-content" Oct 04 02:44:26 crc kubenswrapper[4964]: E1004 02:44:26.153991 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39c2152-b733-43f5-acd7-75e8948518f0" containerName="registry-server" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.154009 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39c2152-b733-43f5-acd7-75e8948518f0" containerName="registry-server" Oct 04 02:44:26 crc kubenswrapper[4964]: E1004 02:44:26.154041 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="772115d2-463f-4fba-a7c8-05aa4e460a22" containerName="pruner" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.154056 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="772115d2-463f-4fba-a7c8-05aa4e460a22" containerName="pruner" Oct 04 02:44:26 crc kubenswrapper[4964]: E1004 02:44:26.154080 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39c2152-b733-43f5-acd7-75e8948518f0" containerName="extract-utilities" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.154098 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39c2152-b733-43f5-acd7-75e8948518f0" containerName="extract-utilities" Oct 04 02:44:26 crc kubenswrapper[4964]: E1004 02:44:26.154115 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c828d956-23f7-4720-9c34-63d6a33833b3" containerName="extract-content" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.154134 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="c828d956-23f7-4720-9c34-63d6a33833b3" containerName="extract-content" Oct 04 02:44:26 crc kubenswrapper[4964]: E1004 02:44:26.154156 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6858848-2cfa-4910-9078-6d94d3e875d5" containerName="extract-content" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.154171 4964 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a6858848-2cfa-4910-9078-6d94d3e875d5" containerName="extract-content" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.154413 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39c2152-b733-43f5-acd7-75e8948518f0" containerName="registry-server" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.154447 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="b564efb8-50db-4d84-a456-4857001ab84a" containerName="oauth-openshift" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.154470 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6858848-2cfa-4910-9078-6d94d3e875d5" containerName="registry-server" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.154493 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d02de6-aa81-4246-bcea-838ed9fe84ed" containerName="registry-server" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.154520 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="b21a779b-5ac3-45ab-bb35-cd476548696b" containerName="pruner" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.154540 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="c828d956-23f7-4720-9c34-63d6a33833b3" containerName="registry-server" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.154554 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="772115d2-463f-4fba-a7c8-05aa4e460a22" containerName="pruner" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.155187 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.168712 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-session\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.168804 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-router-certs\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.168869 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hvsd\" (UniqueName: \"kubernetes.io/projected/b6256df0-1548-4661-8d81-dbbe50256716-kube-api-access-9hvsd\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.168932 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b6256df0-1548-4661-8d81-dbbe50256716-audit-policies\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.168984 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.169044 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-user-template-login\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.169096 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.169147 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b6256df0-1548-4661-8d81-dbbe50256716-audit-dir\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.169204 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-user-template-error\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.169256 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-service-ca\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.169311 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.169358 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.169416 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.169462 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.171122 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7b964c775c-6pr5h"] Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.271247 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-service-ca\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.271347 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.271387 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.271434 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.271472 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.271566 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-session\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.271597 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-router-certs\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.271683 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hvsd\" (UniqueName: \"kubernetes.io/projected/b6256df0-1548-4661-8d81-dbbe50256716-kube-api-access-9hvsd\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.271730 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b6256df0-1548-4661-8d81-dbbe50256716-audit-policies\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.271763 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.271802 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-user-template-login\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.271839 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.271875 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b6256df0-1548-4661-8d81-dbbe50256716-audit-dir\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.271916 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-user-template-error\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.272336 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-service-ca\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.272743 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.273328 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b6256df0-1548-4661-8d81-dbbe50256716-audit-dir\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.274116 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.274249 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b6256df0-1548-4661-8d81-dbbe50256716-audit-policies\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.278196 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.278341 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-session\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 
02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.280087 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-router-certs\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.280141 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.280275 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.280768 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-user-template-error\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.282606 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.282712 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b6256df0-1548-4661-8d81-dbbe50256716-v4-0-config-user-template-login\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.296602 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hvsd\" (UniqueName: \"kubernetes.io/projected/b6256df0-1548-4661-8d81-dbbe50256716-kube-api-access-9hvsd\") pod \"oauth-openshift-7b964c775c-6pr5h\" (UID: \"b6256df0-1548-4661-8d81-dbbe50256716\") " pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.498450 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2zdrf" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.506064 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.607821 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2zdrf"] Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.613635 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2zdrf"] Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.798509 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7b964c775c-6pr5h"] Oct 04 02:44:26 crc kubenswrapper[4964]: W1004 02:44:26.808202 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6256df0_1548_4661_8d81_dbbe50256716.slice/crio-7868a3226a70c7863c90a88861ccd1560bf1207704cbafc84805ec51a7534a07 WatchSource:0}: Error finding container 7868a3226a70c7863c90a88861ccd1560bf1207704cbafc84805ec51a7534a07: Status 404 returned error can't find the container with id 7868a3226a70c7863c90a88861ccd1560bf1207704cbafc84805ec51a7534a07 Oct 04 02:44:26 crc kubenswrapper[4964]: I1004 02:44:26.857036 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b564efb8-50db-4d84-a456-4857001ab84a" path="/var/lib/kubelet/pods/b564efb8-50db-4d84-a456-4857001ab84a/volumes" Oct 04 02:44:27 crc kubenswrapper[4964]: I1004 02:44:27.507586 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" event={"ID":"b6256df0-1548-4661-8d81-dbbe50256716","Type":"ContainerStarted","Data":"a88dabd125148029dfbc8392425b4496879a39ffc1c121b68a2ffc763e147d73"} Oct 04 02:44:27 crc kubenswrapper[4964]: I1004 02:44:27.508158 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:27 crc kubenswrapper[4964]: 
I1004 02:44:27.508195 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" event={"ID":"b6256df0-1548-4661-8d81-dbbe50256716","Type":"ContainerStarted","Data":"7868a3226a70c7863c90a88861ccd1560bf1207704cbafc84805ec51a7534a07"} Oct 04 02:44:27 crc kubenswrapper[4964]: I1004 02:44:27.524541 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" Oct 04 02:44:27 crc kubenswrapper[4964]: I1004 02:44:27.538022 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7b964c775c-6pr5h" podStartSLOduration=27.538007393 podStartE2EDuration="27.538007393s" podCreationTimestamp="2025-10-04 02:44:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:44:27.534563802 +0000 UTC m=+247.431522470" watchObservedRunningTime="2025-10-04 02:44:27.538007393 +0000 UTC m=+247.434966041" Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.690958 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8k555"] Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.692536 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8k555" podUID="29d99f96-8f30-452b-9ca2-1f0c640380f8" containerName="registry-server" containerID="cri-o://a0788187527fa0dd6a45cac8a147a64a96e78a01da4da3c6fb75982193f678c0" gracePeriod=30 Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.700727 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9gg8f"] Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.701297 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9gg8f" 
podUID="698d3183-93e5-4693-8b24-cc507a41d274" containerName="registry-server" containerID="cri-o://ef967c97f130e9c95a38d6efdf6eec57c135b9cabc764d5789bf0dac0e1f8396" gracePeriod=30 Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.709790 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d78cz"] Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.709964 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" podUID="bb432ba7-089d-40a7-a0a7-43b3217a2527" containerName="marketplace-operator" containerID="cri-o://e11c958fdc814e66294c380959efa2654cea41e67a3578ae9d61180a13c7ce8c" gracePeriod=30 Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.715570 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kljsx"] Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.715810 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kljsx" podUID="e1213b9a-e51e-4af9-835b-a39b5378ed60" containerName="registry-server" containerID="cri-o://647ec4274350f2ea692b8d7012dbfdc03d1ba1cbd06d2710a9af79c234fde753" gracePeriod=30 Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.726383 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6gfn4"] Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.726655 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6gfn4" podUID="e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" containerName="registry-server" containerID="cri-o://c61fbcbd30503ee62bbbda82d0ff881e0eebf6254c924cc2e527a402ae92ddc0" gracePeriod=30 Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.731541 4964 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-kqprv"] Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.732326 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.743777 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kqprv"] Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.869303 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0da7f331-58e4-48cf-abb4-a8c8fd7b137b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kqprv\" (UID: \"0da7f331-58e4-48cf-abb4-a8c8fd7b137b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.869466 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0da7f331-58e4-48cf-abb4-a8c8fd7b137b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kqprv\" (UID: \"0da7f331-58e4-48cf-abb4-a8c8fd7b137b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.869657 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xx64\" (UniqueName: \"kubernetes.io/projected/0da7f331-58e4-48cf-abb4-a8c8fd7b137b-kube-api-access-2xx64\") pod \"marketplace-operator-79b997595-kqprv\" (UID: \"0da7f331-58e4-48cf-abb4-a8c8fd7b137b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.970946 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/0da7f331-58e4-48cf-abb4-a8c8fd7b137b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kqprv\" (UID: \"0da7f331-58e4-48cf-abb4-a8c8fd7b137b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.971018 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xx64\" (UniqueName: \"kubernetes.io/projected/0da7f331-58e4-48cf-abb4-a8c8fd7b137b-kube-api-access-2xx64\") pod \"marketplace-operator-79b997595-kqprv\" (UID: \"0da7f331-58e4-48cf-abb4-a8c8fd7b137b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.971039 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0da7f331-58e4-48cf-abb4-a8c8fd7b137b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kqprv\" (UID: \"0da7f331-58e4-48cf-abb4-a8c8fd7b137b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.972504 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0da7f331-58e4-48cf-abb4-a8c8fd7b137b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kqprv\" (UID: \"0da7f331-58e4-48cf-abb4-a8c8fd7b137b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" Oct 04 02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.978206 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0da7f331-58e4-48cf-abb4-a8c8fd7b137b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kqprv\" (UID: \"0da7f331-58e4-48cf-abb4-a8c8fd7b137b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" Oct 04 
02:44:40 crc kubenswrapper[4964]: I1004 02:44:40.992836 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xx64\" (UniqueName: \"kubernetes.io/projected/0da7f331-58e4-48cf-abb4-a8c8fd7b137b-kube-api-access-2xx64\") pod \"marketplace-operator-79b997595-kqprv\" (UID: \"0da7f331-58e4-48cf-abb4-a8c8fd7b137b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.094914 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.098952 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.189965 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.200979 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.215024 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.222778 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.278671 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztbpw\" (UniqueName: \"kubernetes.io/projected/29d99f96-8f30-452b-9ca2-1f0c640380f8-kube-api-access-ztbpw\") pod \"29d99f96-8f30-452b-9ca2-1f0c640380f8\" (UID: \"29d99f96-8f30-452b-9ca2-1f0c640380f8\") " Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.278740 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29d99f96-8f30-452b-9ca2-1f0c640380f8-utilities\") pod \"29d99f96-8f30-452b-9ca2-1f0c640380f8\" (UID: \"29d99f96-8f30-452b-9ca2-1f0c640380f8\") " Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.278811 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29d99f96-8f30-452b-9ca2-1f0c640380f8-catalog-content\") pod \"29d99f96-8f30-452b-9ca2-1f0c640380f8\" (UID: \"29d99f96-8f30-452b-9ca2-1f0c640380f8\") " Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.280716 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29d99f96-8f30-452b-9ca2-1f0c640380f8-utilities" (OuterVolumeSpecName: "utilities") pod "29d99f96-8f30-452b-9ca2-1f0c640380f8" (UID: "29d99f96-8f30-452b-9ca2-1f0c640380f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.301860 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29d99f96-8f30-452b-9ca2-1f0c640380f8-kube-api-access-ztbpw" (OuterVolumeSpecName: "kube-api-access-ztbpw") pod "29d99f96-8f30-452b-9ca2-1f0c640380f8" (UID: "29d99f96-8f30-452b-9ca2-1f0c640380f8"). InnerVolumeSpecName "kube-api-access-ztbpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.312461 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kqprv"] Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.327641 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29d99f96-8f30-452b-9ca2-1f0c640380f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29d99f96-8f30-452b-9ca2-1f0c640380f8" (UID: "29d99f96-8f30-452b-9ca2-1f0c640380f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.381356 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-utilities\") pod \"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6\" (UID: \"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6\") " Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.381537 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698d3183-93e5-4693-8b24-cc507a41d274-utilities\") pod \"698d3183-93e5-4693-8b24-cc507a41d274\" (UID: \"698d3183-93e5-4693-8b24-cc507a41d274\") " Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.381662 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb432ba7-089d-40a7-a0a7-43b3217a2527-marketplace-trusted-ca\") pod \"bb432ba7-089d-40a7-a0a7-43b3217a2527\" (UID: \"bb432ba7-089d-40a7-a0a7-43b3217a2527\") " Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.381767 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjr4b\" (UniqueName: 
\"kubernetes.io/projected/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-kube-api-access-rjr4b\") pod \"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6\" (UID: \"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6\") " Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.381862 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb432ba7-089d-40a7-a0a7-43b3217a2527-marketplace-operator-metrics\") pod \"bb432ba7-089d-40a7-a0a7-43b3217a2527\" (UID: \"bb432ba7-089d-40a7-a0a7-43b3217a2527\") " Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.381965 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-catalog-content\") pod \"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6\" (UID: \"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6\") " Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.382054 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfvc7\" (UniqueName: \"kubernetes.io/projected/e1213b9a-e51e-4af9-835b-a39b5378ed60-kube-api-access-sfvc7\") pod \"e1213b9a-e51e-4af9-835b-a39b5378ed60\" (UID: \"e1213b9a-e51e-4af9-835b-a39b5378ed60\") " Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.382171 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698d3183-93e5-4693-8b24-cc507a41d274-catalog-content\") pod \"698d3183-93e5-4693-8b24-cc507a41d274\" (UID: \"698d3183-93e5-4693-8b24-cc507a41d274\") " Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.390810 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1213b9a-e51e-4af9-835b-a39b5378ed60-utilities\") pod \"e1213b9a-e51e-4af9-835b-a39b5378ed60\" (UID: \"e1213b9a-e51e-4af9-835b-a39b5378ed60\") " Oct 04 
02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.382067 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-utilities" (OuterVolumeSpecName: "utilities") pod "e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" (UID: "e1e0aa21-f10c-449b-898a-6cb1e10c8bf6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.382880 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/698d3183-93e5-4693-8b24-cc507a41d274-utilities" (OuterVolumeSpecName: "utilities") pod "698d3183-93e5-4693-8b24-cc507a41d274" (UID: "698d3183-93e5-4693-8b24-cc507a41d274"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.383218 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb432ba7-089d-40a7-a0a7-43b3217a2527-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "bb432ba7-089d-40a7-a0a7-43b3217a2527" (UID: "bb432ba7-089d-40a7-a0a7-43b3217a2527"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.386897 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1213b9a-e51e-4af9-835b-a39b5378ed60-kube-api-access-sfvc7" (OuterVolumeSpecName: "kube-api-access-sfvc7") pod "e1213b9a-e51e-4af9-835b-a39b5378ed60" (UID: "e1213b9a-e51e-4af9-835b-a39b5378ed60"). InnerVolumeSpecName "kube-api-access-sfvc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.386929 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-kube-api-access-rjr4b" (OuterVolumeSpecName: "kube-api-access-rjr4b") pod "e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" (UID: "e1e0aa21-f10c-449b-898a-6cb1e10c8bf6"). InnerVolumeSpecName "kube-api-access-rjr4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.386956 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb432ba7-089d-40a7-a0a7-43b3217a2527-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "bb432ba7-089d-40a7-a0a7-43b3217a2527" (UID: "bb432ba7-089d-40a7-a0a7-43b3217a2527"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.391141 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcndk\" (UniqueName: \"kubernetes.io/projected/bb432ba7-089d-40a7-a0a7-43b3217a2527-kube-api-access-tcndk\") pod \"bb432ba7-089d-40a7-a0a7-43b3217a2527\" (UID: \"bb432ba7-089d-40a7-a0a7-43b3217a2527\") " Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.391286 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1213b9a-e51e-4af9-835b-a39b5378ed60-catalog-content\") pod \"e1213b9a-e51e-4af9-835b-a39b5378ed60\" (UID: \"e1213b9a-e51e-4af9-835b-a39b5378ed60\") " Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.392056 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78pn6\" (UniqueName: \"kubernetes.io/projected/698d3183-93e5-4693-8b24-cc507a41d274-kube-api-access-78pn6\") pod 
\"698d3183-93e5-4693-8b24-cc507a41d274\" (UID: \"698d3183-93e5-4693-8b24-cc507a41d274\") " Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.392382 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1213b9a-e51e-4af9-835b-a39b5378ed60-utilities" (OuterVolumeSpecName: "utilities") pod "e1213b9a-e51e-4af9-835b-a39b5378ed60" (UID: "e1213b9a-e51e-4af9-835b-a39b5378ed60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.394306 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb432ba7-089d-40a7-a0a7-43b3217a2527-kube-api-access-tcndk" (OuterVolumeSpecName: "kube-api-access-tcndk") pod "bb432ba7-089d-40a7-a0a7-43b3217a2527" (UID: "bb432ba7-089d-40a7-a0a7-43b3217a2527"). InnerVolumeSpecName "kube-api-access-tcndk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.395380 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698d3183-93e5-4693-8b24-cc507a41d274-kube-api-access-78pn6" (OuterVolumeSpecName: "kube-api-access-78pn6") pod "698d3183-93e5-4693-8b24-cc507a41d274" (UID: "698d3183-93e5-4693-8b24-cc507a41d274"). InnerVolumeSpecName "kube-api-access-78pn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.397791 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfvc7\" (UniqueName: \"kubernetes.io/projected/e1213b9a-e51e-4af9-835b-a39b5378ed60-kube-api-access-sfvc7\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.397819 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1213b9a-e51e-4af9-835b-a39b5378ed60-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.397832 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcndk\" (UniqueName: \"kubernetes.io/projected/bb432ba7-089d-40a7-a0a7-43b3217a2527-kube-api-access-tcndk\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.397844 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztbpw\" (UniqueName: \"kubernetes.io/projected/29d99f96-8f30-452b-9ca2-1f0c640380f8-kube-api-access-ztbpw\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.397859 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29d99f96-8f30-452b-9ca2-1f0c640380f8-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.397870 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29d99f96-8f30-452b-9ca2-1f0c640380f8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.397882 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78pn6\" (UniqueName: \"kubernetes.io/projected/698d3183-93e5-4693-8b24-cc507a41d274-kube-api-access-78pn6\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:41 crc 
kubenswrapper[4964]: I1004 02:44:41.397895 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.397906 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698d3183-93e5-4693-8b24-cc507a41d274-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.397920 4964 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb432ba7-089d-40a7-a0a7-43b3217a2527-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.397932 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjr4b\" (UniqueName: \"kubernetes.io/projected/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-kube-api-access-rjr4b\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.397943 4964 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb432ba7-089d-40a7-a0a7-43b3217a2527-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.412983 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1213b9a-e51e-4af9-835b-a39b5378ed60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1213b9a-e51e-4af9-835b-a39b5378ed60" (UID: "e1213b9a-e51e-4af9-835b-a39b5378ed60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.439492 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/698d3183-93e5-4693-8b24-cc507a41d274-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "698d3183-93e5-4693-8b24-cc507a41d274" (UID: "698d3183-93e5-4693-8b24-cc507a41d274"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.467268 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" (UID: "e1e0aa21-f10c-449b-898a-6cb1e10c8bf6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.499104 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.499140 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698d3183-93e5-4693-8b24-cc507a41d274-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.499152 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1213b9a-e51e-4af9-835b-a39b5378ed60-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.584084 4964 generic.go:334] "Generic (PLEG): container finished" podID="bb432ba7-089d-40a7-a0a7-43b3217a2527" 
containerID="e11c958fdc814e66294c380959efa2654cea41e67a3578ae9d61180a13c7ce8c" exitCode=0 Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.584149 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" event={"ID":"bb432ba7-089d-40a7-a0a7-43b3217a2527","Type":"ContainerDied","Data":"e11c958fdc814e66294c380959efa2654cea41e67a3578ae9d61180a13c7ce8c"} Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.584174 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" event={"ID":"bb432ba7-089d-40a7-a0a7-43b3217a2527","Type":"ContainerDied","Data":"f4f9bfc3728c5f57a6557e04ae3ca020186103a5f7c8c54f61b2b433dcd98d7a"} Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.584194 4964 scope.go:117] "RemoveContainer" containerID="e11c958fdc814e66294c380959efa2654cea41e67a3578ae9d61180a13c7ce8c" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.584573 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d78cz" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.586576 4964 generic.go:334] "Generic (PLEG): container finished" podID="698d3183-93e5-4693-8b24-cc507a41d274" containerID="ef967c97f130e9c95a38d6efdf6eec57c135b9cabc764d5789bf0dac0e1f8396" exitCode=0 Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.586650 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9gg8f" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.586658 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gg8f" event={"ID":"698d3183-93e5-4693-8b24-cc507a41d274","Type":"ContainerDied","Data":"ef967c97f130e9c95a38d6efdf6eec57c135b9cabc764d5789bf0dac0e1f8396"} Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.587099 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gg8f" event={"ID":"698d3183-93e5-4693-8b24-cc507a41d274","Type":"ContainerDied","Data":"7747e0f37c1a9d79122f92b2f4dfe9acf0094eea69c7f515f7488bdf58076cf7"} Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.589593 4964 generic.go:334] "Generic (PLEG): container finished" podID="e1213b9a-e51e-4af9-835b-a39b5378ed60" containerID="647ec4274350f2ea692b8d7012dbfdc03d1ba1cbd06d2710a9af79c234fde753" exitCode=0 Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.589676 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kljsx" event={"ID":"e1213b9a-e51e-4af9-835b-a39b5378ed60","Type":"ContainerDied","Data":"647ec4274350f2ea692b8d7012dbfdc03d1ba1cbd06d2710a9af79c234fde753"} Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.589690 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kljsx" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.589705 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kljsx" event={"ID":"e1213b9a-e51e-4af9-835b-a39b5378ed60","Type":"ContainerDied","Data":"c53288fe0d2d27fae5a344e21331e973587c1ea2211878d88fb0c8364f4569f1"} Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.593059 4964 generic.go:334] "Generic (PLEG): container finished" podID="29d99f96-8f30-452b-9ca2-1f0c640380f8" containerID="a0788187527fa0dd6a45cac8a147a64a96e78a01da4da3c6fb75982193f678c0" exitCode=0 Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.593106 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k555" event={"ID":"29d99f96-8f30-452b-9ca2-1f0c640380f8","Type":"ContainerDied","Data":"a0788187527fa0dd6a45cac8a147a64a96e78a01da4da3c6fb75982193f678c0"} Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.593126 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k555" event={"ID":"29d99f96-8f30-452b-9ca2-1f0c640380f8","Type":"ContainerDied","Data":"a2dd474d4840c71f88827a5e683cb04f9ec146c2ba5c453a322cfe7991f757cc"} Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.593176 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8k555" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.594813 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" event={"ID":"0da7f331-58e4-48cf-abb4-a8c8fd7b137b","Type":"ContainerStarted","Data":"d1757336383e844e11442289497edf87bfc28c9e6c64aae0c69fd0667817a6a2"} Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.594857 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" event={"ID":"0da7f331-58e4-48cf-abb4-a8c8fd7b137b","Type":"ContainerStarted","Data":"b506fb3ed4501988216627ffb5af099e2bb2ceef34885c18f3d627312c521e24"} Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.594957 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.597679 4964 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kqprv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body= Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.597718 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" podUID="0da7f331-58e4-48cf-abb4-a8c8fd7b137b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.599062 4964 scope.go:117] "RemoveContainer" containerID="e11c958fdc814e66294c380959efa2654cea41e67a3578ae9d61180a13c7ce8c" Oct 04 02:44:41 crc kubenswrapper[4964]: E1004 02:44:41.599324 4964 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e11c958fdc814e66294c380959efa2654cea41e67a3578ae9d61180a13c7ce8c\": container with ID starting with e11c958fdc814e66294c380959efa2654cea41e67a3578ae9d61180a13c7ce8c not found: ID does not exist" containerID="e11c958fdc814e66294c380959efa2654cea41e67a3578ae9d61180a13c7ce8c" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.599375 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e11c958fdc814e66294c380959efa2654cea41e67a3578ae9d61180a13c7ce8c"} err="failed to get container status \"e11c958fdc814e66294c380959efa2654cea41e67a3578ae9d61180a13c7ce8c\": rpc error: code = NotFound desc = could not find container \"e11c958fdc814e66294c380959efa2654cea41e67a3578ae9d61180a13c7ce8c\": container with ID starting with e11c958fdc814e66294c380959efa2654cea41e67a3578ae9d61180a13c7ce8c not found: ID does not exist" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.599402 4964 scope.go:117] "RemoveContainer" containerID="ef967c97f130e9c95a38d6efdf6eec57c135b9cabc764d5789bf0dac0e1f8396" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.601415 4964 generic.go:334] "Generic (PLEG): container finished" podID="e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" containerID="c61fbcbd30503ee62bbbda82d0ff881e0eebf6254c924cc2e527a402ae92ddc0" exitCode=0 Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.601452 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gfn4" event={"ID":"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6","Type":"ContainerDied","Data":"c61fbcbd30503ee62bbbda82d0ff881e0eebf6254c924cc2e527a402ae92ddc0"} Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.601478 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gfn4" 
event={"ID":"e1e0aa21-f10c-449b-898a-6cb1e10c8bf6","Type":"ContainerDied","Data":"efeb7227e7895f9f4f79d291b2395e9dc57631717b04aaca245da3a6b98edb39"} Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.601526 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6gfn4" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.626936 4964 scope.go:117] "RemoveContainer" containerID="9ea4d624e17d3389fa92665d77e191691dd977afc9b589c962df3c757048e29f" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.642133 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" podStartSLOduration=1.642113405 podStartE2EDuration="1.642113405s" podCreationTimestamp="2025-10-04 02:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:44:41.627013311 +0000 UTC m=+261.523971959" watchObservedRunningTime="2025-10-04 02:44:41.642113405 +0000 UTC m=+261.539072043" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.658681 4964 scope.go:117] "RemoveContainer" containerID="fbd8235582b3706b172c24b1ee2b892ceae05ce344f453bb49e333acdfb09f12" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.664726 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d78cz"] Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.669686 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d78cz"] Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.693395 4964 scope.go:117] "RemoveContainer" containerID="ef967c97f130e9c95a38d6efdf6eec57c135b9cabc764d5789bf0dac0e1f8396" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.693697 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-9gg8f"] Oct 04 02:44:41 crc kubenswrapper[4964]: E1004 02:44:41.693963 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef967c97f130e9c95a38d6efdf6eec57c135b9cabc764d5789bf0dac0e1f8396\": container with ID starting with ef967c97f130e9c95a38d6efdf6eec57c135b9cabc764d5789bf0dac0e1f8396 not found: ID does not exist" containerID="ef967c97f130e9c95a38d6efdf6eec57c135b9cabc764d5789bf0dac0e1f8396" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.694018 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef967c97f130e9c95a38d6efdf6eec57c135b9cabc764d5789bf0dac0e1f8396"} err="failed to get container status \"ef967c97f130e9c95a38d6efdf6eec57c135b9cabc764d5789bf0dac0e1f8396\": rpc error: code = NotFound desc = could not find container \"ef967c97f130e9c95a38d6efdf6eec57c135b9cabc764d5789bf0dac0e1f8396\": container with ID starting with ef967c97f130e9c95a38d6efdf6eec57c135b9cabc764d5789bf0dac0e1f8396 not found: ID does not exist" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.694049 4964 scope.go:117] "RemoveContainer" containerID="9ea4d624e17d3389fa92665d77e191691dd977afc9b589c962df3c757048e29f" Oct 04 02:44:41 crc kubenswrapper[4964]: E1004 02:44:41.694421 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ea4d624e17d3389fa92665d77e191691dd977afc9b589c962df3c757048e29f\": container with ID starting with 9ea4d624e17d3389fa92665d77e191691dd977afc9b589c962df3c757048e29f not found: ID does not exist" containerID="9ea4d624e17d3389fa92665d77e191691dd977afc9b589c962df3c757048e29f" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.694477 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea4d624e17d3389fa92665d77e191691dd977afc9b589c962df3c757048e29f"} 
err="failed to get container status \"9ea4d624e17d3389fa92665d77e191691dd977afc9b589c962df3c757048e29f\": rpc error: code = NotFound desc = could not find container \"9ea4d624e17d3389fa92665d77e191691dd977afc9b589c962df3c757048e29f\": container with ID starting with 9ea4d624e17d3389fa92665d77e191691dd977afc9b589c962df3c757048e29f not found: ID does not exist" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.694498 4964 scope.go:117] "RemoveContainer" containerID="fbd8235582b3706b172c24b1ee2b892ceae05ce344f453bb49e333acdfb09f12" Oct 04 02:44:41 crc kubenswrapper[4964]: E1004 02:44:41.694730 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbd8235582b3706b172c24b1ee2b892ceae05ce344f453bb49e333acdfb09f12\": container with ID starting with fbd8235582b3706b172c24b1ee2b892ceae05ce344f453bb49e333acdfb09f12 not found: ID does not exist" containerID="fbd8235582b3706b172c24b1ee2b892ceae05ce344f453bb49e333acdfb09f12" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.694752 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbd8235582b3706b172c24b1ee2b892ceae05ce344f453bb49e333acdfb09f12"} err="failed to get container status \"fbd8235582b3706b172c24b1ee2b892ceae05ce344f453bb49e333acdfb09f12\": rpc error: code = NotFound desc = could not find container \"fbd8235582b3706b172c24b1ee2b892ceae05ce344f453bb49e333acdfb09f12\": container with ID starting with fbd8235582b3706b172c24b1ee2b892ceae05ce344f453bb49e333acdfb09f12 not found: ID does not exist" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.694764 4964 scope.go:117] "RemoveContainer" containerID="647ec4274350f2ea692b8d7012dbfdc03d1ba1cbd06d2710a9af79c234fde753" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.706295 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9gg8f"] Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 
02:44:41.709884 4964 scope.go:117] "RemoveContainer" containerID="2381dbca20bd7c5d44b8656328c0ceb912e8e8382413f5aee757e82c541090b1" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.710009 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8k555"] Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.712641 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8k555"] Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.716318 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kljsx"] Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.722009 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kljsx"] Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.722050 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6gfn4"] Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.723594 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6gfn4"] Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.729053 4964 scope.go:117] "RemoveContainer" containerID="4d1d674c63b3ad694d476ade0f85c8490770b5dceed6306aa1006dc5be6dbcf6" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.745470 4964 scope.go:117] "RemoveContainer" containerID="647ec4274350f2ea692b8d7012dbfdc03d1ba1cbd06d2710a9af79c234fde753" Oct 04 02:44:41 crc kubenswrapper[4964]: E1004 02:44:41.746079 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"647ec4274350f2ea692b8d7012dbfdc03d1ba1cbd06d2710a9af79c234fde753\": container with ID starting with 647ec4274350f2ea692b8d7012dbfdc03d1ba1cbd06d2710a9af79c234fde753 not found: ID does not exist" containerID="647ec4274350f2ea692b8d7012dbfdc03d1ba1cbd06d2710a9af79c234fde753" Oct 04 
02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.746118 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"647ec4274350f2ea692b8d7012dbfdc03d1ba1cbd06d2710a9af79c234fde753"} err="failed to get container status \"647ec4274350f2ea692b8d7012dbfdc03d1ba1cbd06d2710a9af79c234fde753\": rpc error: code = NotFound desc = could not find container \"647ec4274350f2ea692b8d7012dbfdc03d1ba1cbd06d2710a9af79c234fde753\": container with ID starting with 647ec4274350f2ea692b8d7012dbfdc03d1ba1cbd06d2710a9af79c234fde753 not found: ID does not exist" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.746144 4964 scope.go:117] "RemoveContainer" containerID="2381dbca20bd7c5d44b8656328c0ceb912e8e8382413f5aee757e82c541090b1" Oct 04 02:44:41 crc kubenswrapper[4964]: E1004 02:44:41.746555 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2381dbca20bd7c5d44b8656328c0ceb912e8e8382413f5aee757e82c541090b1\": container with ID starting with 2381dbca20bd7c5d44b8656328c0ceb912e8e8382413f5aee757e82c541090b1 not found: ID does not exist" containerID="2381dbca20bd7c5d44b8656328c0ceb912e8e8382413f5aee757e82c541090b1" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.746643 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2381dbca20bd7c5d44b8656328c0ceb912e8e8382413f5aee757e82c541090b1"} err="failed to get container status \"2381dbca20bd7c5d44b8656328c0ceb912e8e8382413f5aee757e82c541090b1\": rpc error: code = NotFound desc = could not find container \"2381dbca20bd7c5d44b8656328c0ceb912e8e8382413f5aee757e82c541090b1\": container with ID starting with 2381dbca20bd7c5d44b8656328c0ceb912e8e8382413f5aee757e82c541090b1 not found: ID does not exist" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.746678 4964 scope.go:117] "RemoveContainer" 
containerID="4d1d674c63b3ad694d476ade0f85c8490770b5dceed6306aa1006dc5be6dbcf6" Oct 04 02:44:41 crc kubenswrapper[4964]: E1004 02:44:41.746940 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1d674c63b3ad694d476ade0f85c8490770b5dceed6306aa1006dc5be6dbcf6\": container with ID starting with 4d1d674c63b3ad694d476ade0f85c8490770b5dceed6306aa1006dc5be6dbcf6 not found: ID does not exist" containerID="4d1d674c63b3ad694d476ade0f85c8490770b5dceed6306aa1006dc5be6dbcf6" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.746961 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1d674c63b3ad694d476ade0f85c8490770b5dceed6306aa1006dc5be6dbcf6"} err="failed to get container status \"4d1d674c63b3ad694d476ade0f85c8490770b5dceed6306aa1006dc5be6dbcf6\": rpc error: code = NotFound desc = could not find container \"4d1d674c63b3ad694d476ade0f85c8490770b5dceed6306aa1006dc5be6dbcf6\": container with ID starting with 4d1d674c63b3ad694d476ade0f85c8490770b5dceed6306aa1006dc5be6dbcf6 not found: ID does not exist" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.746976 4964 scope.go:117] "RemoveContainer" containerID="a0788187527fa0dd6a45cac8a147a64a96e78a01da4da3c6fb75982193f678c0" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.759576 4964 scope.go:117] "RemoveContainer" containerID="f94770f5f14ec4b717bc20f26535cbf4ec7aea38ac224bcc652d2f5766505e87" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.772417 4964 scope.go:117] "RemoveContainer" containerID="be8597e169aeb4bd12bf05772354bc69f2315d1791bb32e570759118ca04e941" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.789046 4964 scope.go:117] "RemoveContainer" containerID="a0788187527fa0dd6a45cac8a147a64a96e78a01da4da3c6fb75982193f678c0" Oct 04 02:44:41 crc kubenswrapper[4964]: E1004 02:44:41.789510 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"a0788187527fa0dd6a45cac8a147a64a96e78a01da4da3c6fb75982193f678c0\": container with ID starting with a0788187527fa0dd6a45cac8a147a64a96e78a01da4da3c6fb75982193f678c0 not found: ID does not exist" containerID="a0788187527fa0dd6a45cac8a147a64a96e78a01da4da3c6fb75982193f678c0" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.789550 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0788187527fa0dd6a45cac8a147a64a96e78a01da4da3c6fb75982193f678c0"} err="failed to get container status \"a0788187527fa0dd6a45cac8a147a64a96e78a01da4da3c6fb75982193f678c0\": rpc error: code = NotFound desc = could not find container \"a0788187527fa0dd6a45cac8a147a64a96e78a01da4da3c6fb75982193f678c0\": container with ID starting with a0788187527fa0dd6a45cac8a147a64a96e78a01da4da3c6fb75982193f678c0 not found: ID does not exist" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.789577 4964 scope.go:117] "RemoveContainer" containerID="f94770f5f14ec4b717bc20f26535cbf4ec7aea38ac224bcc652d2f5766505e87" Oct 04 02:44:41 crc kubenswrapper[4964]: E1004 02:44:41.790528 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f94770f5f14ec4b717bc20f26535cbf4ec7aea38ac224bcc652d2f5766505e87\": container with ID starting with f94770f5f14ec4b717bc20f26535cbf4ec7aea38ac224bcc652d2f5766505e87 not found: ID does not exist" containerID="f94770f5f14ec4b717bc20f26535cbf4ec7aea38ac224bcc652d2f5766505e87" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.790587 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f94770f5f14ec4b717bc20f26535cbf4ec7aea38ac224bcc652d2f5766505e87"} err="failed to get container status \"f94770f5f14ec4b717bc20f26535cbf4ec7aea38ac224bcc652d2f5766505e87\": rpc error: code = NotFound desc = could not find container 
\"f94770f5f14ec4b717bc20f26535cbf4ec7aea38ac224bcc652d2f5766505e87\": container with ID starting with f94770f5f14ec4b717bc20f26535cbf4ec7aea38ac224bcc652d2f5766505e87 not found: ID does not exist" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.790646 4964 scope.go:117] "RemoveContainer" containerID="be8597e169aeb4bd12bf05772354bc69f2315d1791bb32e570759118ca04e941" Oct 04 02:44:41 crc kubenswrapper[4964]: E1004 02:44:41.791100 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be8597e169aeb4bd12bf05772354bc69f2315d1791bb32e570759118ca04e941\": container with ID starting with be8597e169aeb4bd12bf05772354bc69f2315d1791bb32e570759118ca04e941 not found: ID does not exist" containerID="be8597e169aeb4bd12bf05772354bc69f2315d1791bb32e570759118ca04e941" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.791129 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be8597e169aeb4bd12bf05772354bc69f2315d1791bb32e570759118ca04e941"} err="failed to get container status \"be8597e169aeb4bd12bf05772354bc69f2315d1791bb32e570759118ca04e941\": rpc error: code = NotFound desc = could not find container \"be8597e169aeb4bd12bf05772354bc69f2315d1791bb32e570759118ca04e941\": container with ID starting with be8597e169aeb4bd12bf05772354bc69f2315d1791bb32e570759118ca04e941 not found: ID does not exist" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.791149 4964 scope.go:117] "RemoveContainer" containerID="c61fbcbd30503ee62bbbda82d0ff881e0eebf6254c924cc2e527a402ae92ddc0" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.803838 4964 scope.go:117] "RemoveContainer" containerID="5e0fee700df337b740a0a7b62a7b6b1c8c3d05bf2383e02be7b7457c2f6a0332" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.829666 4964 scope.go:117] "RemoveContainer" containerID="1858754f8ac97f2fe9a40b1e13a620dd7234ee0f87d7358db1112127779efb42" Oct 04 02:44:41 crc 
kubenswrapper[4964]: I1004 02:44:41.881702 4964 scope.go:117] "RemoveContainer" containerID="c61fbcbd30503ee62bbbda82d0ff881e0eebf6254c924cc2e527a402ae92ddc0" Oct 04 02:44:41 crc kubenswrapper[4964]: E1004 02:44:41.882230 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c61fbcbd30503ee62bbbda82d0ff881e0eebf6254c924cc2e527a402ae92ddc0\": container with ID starting with c61fbcbd30503ee62bbbda82d0ff881e0eebf6254c924cc2e527a402ae92ddc0 not found: ID does not exist" containerID="c61fbcbd30503ee62bbbda82d0ff881e0eebf6254c924cc2e527a402ae92ddc0" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.882267 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61fbcbd30503ee62bbbda82d0ff881e0eebf6254c924cc2e527a402ae92ddc0"} err="failed to get container status \"c61fbcbd30503ee62bbbda82d0ff881e0eebf6254c924cc2e527a402ae92ddc0\": rpc error: code = NotFound desc = could not find container \"c61fbcbd30503ee62bbbda82d0ff881e0eebf6254c924cc2e527a402ae92ddc0\": container with ID starting with c61fbcbd30503ee62bbbda82d0ff881e0eebf6254c924cc2e527a402ae92ddc0 not found: ID does not exist" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.882289 4964 scope.go:117] "RemoveContainer" containerID="5e0fee700df337b740a0a7b62a7b6b1c8c3d05bf2383e02be7b7457c2f6a0332" Oct 04 02:44:41 crc kubenswrapper[4964]: E1004 02:44:41.882694 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e0fee700df337b740a0a7b62a7b6b1c8c3d05bf2383e02be7b7457c2f6a0332\": container with ID starting with 5e0fee700df337b740a0a7b62a7b6b1c8c3d05bf2383e02be7b7457c2f6a0332 not found: ID does not exist" containerID="5e0fee700df337b740a0a7b62a7b6b1c8c3d05bf2383e02be7b7457c2f6a0332" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.882736 4964 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5e0fee700df337b740a0a7b62a7b6b1c8c3d05bf2383e02be7b7457c2f6a0332"} err="failed to get container status \"5e0fee700df337b740a0a7b62a7b6b1c8c3d05bf2383e02be7b7457c2f6a0332\": rpc error: code = NotFound desc = could not find container \"5e0fee700df337b740a0a7b62a7b6b1c8c3d05bf2383e02be7b7457c2f6a0332\": container with ID starting with 5e0fee700df337b740a0a7b62a7b6b1c8c3d05bf2383e02be7b7457c2f6a0332 not found: ID does not exist" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.882764 4964 scope.go:117] "RemoveContainer" containerID="1858754f8ac97f2fe9a40b1e13a620dd7234ee0f87d7358db1112127779efb42" Oct 04 02:44:41 crc kubenswrapper[4964]: E1004 02:44:41.883019 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1858754f8ac97f2fe9a40b1e13a620dd7234ee0f87d7358db1112127779efb42\": container with ID starting with 1858754f8ac97f2fe9a40b1e13a620dd7234ee0f87d7358db1112127779efb42 not found: ID does not exist" containerID="1858754f8ac97f2fe9a40b1e13a620dd7234ee0f87d7358db1112127779efb42" Oct 04 02:44:41 crc kubenswrapper[4964]: I1004 02:44:41.883037 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1858754f8ac97f2fe9a40b1e13a620dd7234ee0f87d7358db1112127779efb42"} err="failed to get container status \"1858754f8ac97f2fe9a40b1e13a620dd7234ee0f87d7358db1112127779efb42\": rpc error: code = NotFound desc = could not find container \"1858754f8ac97f2fe9a40b1e13a620dd7234ee0f87d7358db1112127779efb42\": container with ID starting with 1858754f8ac97f2fe9a40b1e13a620dd7234ee0f87d7358db1112127779efb42 not found: ID does not exist" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.614802 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kqprv" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.708196 4964 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zqv28"] Oct 04 02:44:42 crc kubenswrapper[4964]: E1004 02:44:42.709248 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698d3183-93e5-4693-8b24-cc507a41d274" containerName="extract-content" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.709343 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="698d3183-93e5-4693-8b24-cc507a41d274" containerName="extract-content" Oct 04 02:44:42 crc kubenswrapper[4964]: E1004 02:44:42.709406 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" containerName="extract-utilities" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.709457 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" containerName="extract-utilities" Oct 04 02:44:42 crc kubenswrapper[4964]: E1004 02:44:42.709510 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1213b9a-e51e-4af9-835b-a39b5378ed60" containerName="registry-server" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.709566 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1213b9a-e51e-4af9-835b-a39b5378ed60" containerName="registry-server" Oct 04 02:44:42 crc kubenswrapper[4964]: E1004 02:44:42.709638 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d99f96-8f30-452b-9ca2-1f0c640380f8" containerName="extract-utilities" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.709691 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d99f96-8f30-452b-9ca2-1f0c640380f8" containerName="extract-utilities" Oct 04 02:44:42 crc kubenswrapper[4964]: E1004 02:44:42.709752 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb432ba7-089d-40a7-a0a7-43b3217a2527" containerName="marketplace-operator" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.709802 4964 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bb432ba7-089d-40a7-a0a7-43b3217a2527" containerName="marketplace-operator" Oct 04 02:44:42 crc kubenswrapper[4964]: E1004 02:44:42.709861 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" containerName="registry-server" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.709995 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" containerName="registry-server" Oct 04 02:44:42 crc kubenswrapper[4964]: E1004 02:44:42.710055 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698d3183-93e5-4693-8b24-cc507a41d274" containerName="registry-server" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.710126 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="698d3183-93e5-4693-8b24-cc507a41d274" containerName="registry-server" Oct 04 02:44:42 crc kubenswrapper[4964]: E1004 02:44:42.710210 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1213b9a-e51e-4af9-835b-a39b5378ed60" containerName="extract-utilities" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.710262 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1213b9a-e51e-4af9-835b-a39b5378ed60" containerName="extract-utilities" Oct 04 02:44:42 crc kubenswrapper[4964]: E1004 02:44:42.710314 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" containerName="extract-content" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.710605 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" containerName="extract-content" Oct 04 02:44:42 crc kubenswrapper[4964]: E1004 02:44:42.710742 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d99f96-8f30-452b-9ca2-1f0c640380f8" containerName="registry-server" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.710802 4964 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="29d99f96-8f30-452b-9ca2-1f0c640380f8" containerName="registry-server" Oct 04 02:44:42 crc kubenswrapper[4964]: E1004 02:44:42.710854 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d99f96-8f30-452b-9ca2-1f0c640380f8" containerName="extract-content" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.710908 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d99f96-8f30-452b-9ca2-1f0c640380f8" containerName="extract-content" Oct 04 02:44:42 crc kubenswrapper[4964]: E1004 02:44:42.710965 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1213b9a-e51e-4af9-835b-a39b5378ed60" containerName="extract-content" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.711019 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1213b9a-e51e-4af9-835b-a39b5378ed60" containerName="extract-content" Oct 04 02:44:42 crc kubenswrapper[4964]: E1004 02:44:42.711095 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698d3183-93e5-4693-8b24-cc507a41d274" containerName="extract-utilities" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.711149 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="698d3183-93e5-4693-8b24-cc507a41d274" containerName="extract-utilities" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.712098 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="698d3183-93e5-4693-8b24-cc507a41d274" containerName="registry-server" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.712156 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1213b9a-e51e-4af9-835b-a39b5378ed60" containerName="registry-server" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.712172 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="29d99f96-8f30-452b-9ca2-1f0c640380f8" containerName="registry-server" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.712197 4964 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" containerName="registry-server" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.712219 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb432ba7-089d-40a7-a0a7-43b3217a2527" containerName="marketplace-operator" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.714709 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zqv28" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.716690 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.724991 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zqv28"] Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.812478 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8wpr\" (UniqueName: \"kubernetes.io/projected/9e9d5421-37a5-4691-be42-0d69ce5c9150-kube-api-access-b8wpr\") pod \"redhat-marketplace-zqv28\" (UID: \"9e9d5421-37a5-4691-be42-0d69ce5c9150\") " pod="openshift-marketplace/redhat-marketplace-zqv28" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.812564 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e9d5421-37a5-4691-be42-0d69ce5c9150-utilities\") pod \"redhat-marketplace-zqv28\" (UID: \"9e9d5421-37a5-4691-be42-0d69ce5c9150\") " pod="openshift-marketplace/redhat-marketplace-zqv28" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.812595 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e9d5421-37a5-4691-be42-0d69ce5c9150-catalog-content\") pod \"redhat-marketplace-zqv28\" (UID: 
\"9e9d5421-37a5-4691-be42-0d69ce5c9150\") " pod="openshift-marketplace/redhat-marketplace-zqv28" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.853755 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29d99f96-8f30-452b-9ca2-1f0c640380f8" path="/var/lib/kubelet/pods/29d99f96-8f30-452b-9ca2-1f0c640380f8/volumes" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.854362 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698d3183-93e5-4693-8b24-cc507a41d274" path="/var/lib/kubelet/pods/698d3183-93e5-4693-8b24-cc507a41d274/volumes" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.854927 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb432ba7-089d-40a7-a0a7-43b3217a2527" path="/var/lib/kubelet/pods/bb432ba7-089d-40a7-a0a7-43b3217a2527/volumes" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.856172 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1213b9a-e51e-4af9-835b-a39b5378ed60" path="/var/lib/kubelet/pods/e1213b9a-e51e-4af9-835b-a39b5378ed60/volumes" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.857330 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1e0aa21-f10c-449b-898a-6cb1e10c8bf6" path="/var/lib/kubelet/pods/e1e0aa21-f10c-449b-898a-6cb1e10c8bf6/volumes" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.913767 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8wpr\" (UniqueName: \"kubernetes.io/projected/9e9d5421-37a5-4691-be42-0d69ce5c9150-kube-api-access-b8wpr\") pod \"redhat-marketplace-zqv28\" (UID: \"9e9d5421-37a5-4691-be42-0d69ce5c9150\") " pod="openshift-marketplace/redhat-marketplace-zqv28" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.913827 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9e9d5421-37a5-4691-be42-0d69ce5c9150-utilities\") pod \"redhat-marketplace-zqv28\" (UID: \"9e9d5421-37a5-4691-be42-0d69ce5c9150\") " pod="openshift-marketplace/redhat-marketplace-zqv28" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.913842 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e9d5421-37a5-4691-be42-0d69ce5c9150-catalog-content\") pod \"redhat-marketplace-zqv28\" (UID: \"9e9d5421-37a5-4691-be42-0d69ce5c9150\") " pod="openshift-marketplace/redhat-marketplace-zqv28" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.914294 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e9d5421-37a5-4691-be42-0d69ce5c9150-catalog-content\") pod \"redhat-marketplace-zqv28\" (UID: \"9e9d5421-37a5-4691-be42-0d69ce5c9150\") " pod="openshift-marketplace/redhat-marketplace-zqv28" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.914832 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e9d5421-37a5-4691-be42-0d69ce5c9150-utilities\") pod \"redhat-marketplace-zqv28\" (UID: \"9e9d5421-37a5-4691-be42-0d69ce5c9150\") " pod="openshift-marketplace/redhat-marketplace-zqv28" Oct 04 02:44:42 crc kubenswrapper[4964]: I1004 02:44:42.960760 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8wpr\" (UniqueName: \"kubernetes.io/projected/9e9d5421-37a5-4691-be42-0d69ce5c9150-kube-api-access-b8wpr\") pod \"redhat-marketplace-zqv28\" (UID: \"9e9d5421-37a5-4691-be42-0d69ce5c9150\") " pod="openshift-marketplace/redhat-marketplace-zqv28" Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.033586 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zqv28" Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.253380 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zqv28"] Oct 04 02:44:43 crc kubenswrapper[4964]: W1004 02:44:43.264787 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e9d5421_37a5_4691_be42_0d69ce5c9150.slice/crio-abe80d1245e80d0bb98084409d460590c4651a88e23a462f2ec751e08acea241 WatchSource:0}: Error finding container abe80d1245e80d0bb98084409d460590c4651a88e23a462f2ec751e08acea241: Status 404 returned error can't find the container with id abe80d1245e80d0bb98084409d460590c4651a88e23a462f2ec751e08acea241 Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.309344 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rphwc"] Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.310539 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rphwc" Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.312106 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.317032 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rphwc"] Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.419149 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tpn5\" (UniqueName: \"kubernetes.io/projected/1fa77cb7-85eb-4961-b663-dc464f81426b-kube-api-access-9tpn5\") pod \"redhat-operators-rphwc\" (UID: \"1fa77cb7-85eb-4961-b663-dc464f81426b\") " pod="openshift-marketplace/redhat-operators-rphwc" Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.419194 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa77cb7-85eb-4961-b663-dc464f81426b-catalog-content\") pod \"redhat-operators-rphwc\" (UID: \"1fa77cb7-85eb-4961-b663-dc464f81426b\") " pod="openshift-marketplace/redhat-operators-rphwc" Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.419321 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa77cb7-85eb-4961-b663-dc464f81426b-utilities\") pod \"redhat-operators-rphwc\" (UID: \"1fa77cb7-85eb-4961-b663-dc464f81426b\") " pod="openshift-marketplace/redhat-operators-rphwc" Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.520855 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tpn5\" (UniqueName: \"kubernetes.io/projected/1fa77cb7-85eb-4961-b663-dc464f81426b-kube-api-access-9tpn5\") pod \"redhat-operators-rphwc\" (UID: 
\"1fa77cb7-85eb-4961-b663-dc464f81426b\") " pod="openshift-marketplace/redhat-operators-rphwc" Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.520903 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa77cb7-85eb-4961-b663-dc464f81426b-catalog-content\") pod \"redhat-operators-rphwc\" (UID: \"1fa77cb7-85eb-4961-b663-dc464f81426b\") " pod="openshift-marketplace/redhat-operators-rphwc" Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.520940 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa77cb7-85eb-4961-b663-dc464f81426b-utilities\") pod \"redhat-operators-rphwc\" (UID: \"1fa77cb7-85eb-4961-b663-dc464f81426b\") " pod="openshift-marketplace/redhat-operators-rphwc" Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.521322 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa77cb7-85eb-4961-b663-dc464f81426b-utilities\") pod \"redhat-operators-rphwc\" (UID: \"1fa77cb7-85eb-4961-b663-dc464f81426b\") " pod="openshift-marketplace/redhat-operators-rphwc" Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.521445 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa77cb7-85eb-4961-b663-dc464f81426b-catalog-content\") pod \"redhat-operators-rphwc\" (UID: \"1fa77cb7-85eb-4961-b663-dc464f81426b\") " pod="openshift-marketplace/redhat-operators-rphwc" Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.540205 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tpn5\" (UniqueName: \"kubernetes.io/projected/1fa77cb7-85eb-4961-b663-dc464f81426b-kube-api-access-9tpn5\") pod \"redhat-operators-rphwc\" (UID: \"1fa77cb7-85eb-4961-b663-dc464f81426b\") " 
pod="openshift-marketplace/redhat-operators-rphwc" Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.639432 4964 generic.go:334] "Generic (PLEG): container finished" podID="9e9d5421-37a5-4691-be42-0d69ce5c9150" containerID="abd79708ccc337ba15d49ec1313d92ed437bc6557d641859dc3a8ca0814d5688" exitCode=0 Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.639537 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqv28" event={"ID":"9e9d5421-37a5-4691-be42-0d69ce5c9150","Type":"ContainerDied","Data":"abd79708ccc337ba15d49ec1313d92ed437bc6557d641859dc3a8ca0814d5688"} Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.639599 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqv28" event={"ID":"9e9d5421-37a5-4691-be42-0d69ce5c9150","Type":"ContainerStarted","Data":"abe80d1245e80d0bb98084409d460590c4651a88e23a462f2ec751e08acea241"} Oct 04 02:44:43 crc kubenswrapper[4964]: I1004 02:44:43.654140 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rphwc" Oct 04 02:44:44 crc kubenswrapper[4964]: I1004 02:44:44.077546 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rphwc"] Oct 04 02:44:44 crc kubenswrapper[4964]: I1004 02:44:44.647700 4964 generic.go:334] "Generic (PLEG): container finished" podID="1fa77cb7-85eb-4961-b663-dc464f81426b" containerID="3a75cd5cae59d06fc82ed3181c7a57d6bac696c166c7a8985aad5a00c9485c2a" exitCode=0 Oct 04 02:44:44 crc kubenswrapper[4964]: I1004 02:44:44.647819 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rphwc" event={"ID":"1fa77cb7-85eb-4961-b663-dc464f81426b","Type":"ContainerDied","Data":"3a75cd5cae59d06fc82ed3181c7a57d6bac696c166c7a8985aad5a00c9485c2a"} Oct 04 02:44:44 crc kubenswrapper[4964]: I1004 02:44:44.648477 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rphwc" event={"ID":"1fa77cb7-85eb-4961-b663-dc464f81426b","Type":"ContainerStarted","Data":"e515304ab904ea649ffc7b47ce83f908da966c6aedc37106a4a031e57c3a015f"} Oct 04 02:44:44 crc kubenswrapper[4964]: I1004 02:44:44.651182 4964 generic.go:334] "Generic (PLEG): container finished" podID="9e9d5421-37a5-4691-be42-0d69ce5c9150" containerID="846215aaa480128932ec5ab9f168b14ab91d54b107087517ada91ba56ee391b3" exitCode=0 Oct 04 02:44:44 crc kubenswrapper[4964]: I1004 02:44:44.651241 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqv28" event={"ID":"9e9d5421-37a5-4691-be42-0d69ce5c9150","Type":"ContainerDied","Data":"846215aaa480128932ec5ab9f168b14ab91d54b107087517ada91ba56ee391b3"} Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.109536 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vvnjd"] Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.112937 4964 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-vvnjd" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.115111 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.119456 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vvnjd"] Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.245345 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcda0354-8784-4a0e-86bc-b06464d136d9-utilities\") pod \"community-operators-vvnjd\" (UID: \"bcda0354-8784-4a0e-86bc-b06464d136d9\") " pod="openshift-marketplace/community-operators-vvnjd" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.245877 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhpd8\" (UniqueName: \"kubernetes.io/projected/bcda0354-8784-4a0e-86bc-b06464d136d9-kube-api-access-jhpd8\") pod \"community-operators-vvnjd\" (UID: \"bcda0354-8784-4a0e-86bc-b06464d136d9\") " pod="openshift-marketplace/community-operators-vvnjd" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.246042 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcda0354-8784-4a0e-86bc-b06464d136d9-catalog-content\") pod \"community-operators-vvnjd\" (UID: \"bcda0354-8784-4a0e-86bc-b06464d136d9\") " pod="openshift-marketplace/community-operators-vvnjd" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.347170 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcda0354-8784-4a0e-86bc-b06464d136d9-utilities\") pod \"community-operators-vvnjd\" (UID: 
\"bcda0354-8784-4a0e-86bc-b06464d136d9\") " pod="openshift-marketplace/community-operators-vvnjd" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.347225 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhpd8\" (UniqueName: \"kubernetes.io/projected/bcda0354-8784-4a0e-86bc-b06464d136d9-kube-api-access-jhpd8\") pod \"community-operators-vvnjd\" (UID: \"bcda0354-8784-4a0e-86bc-b06464d136d9\") " pod="openshift-marketplace/community-operators-vvnjd" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.347279 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcda0354-8784-4a0e-86bc-b06464d136d9-catalog-content\") pod \"community-operators-vvnjd\" (UID: \"bcda0354-8784-4a0e-86bc-b06464d136d9\") " pod="openshift-marketplace/community-operators-vvnjd" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.347730 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcda0354-8784-4a0e-86bc-b06464d136d9-utilities\") pod \"community-operators-vvnjd\" (UID: \"bcda0354-8784-4a0e-86bc-b06464d136d9\") " pod="openshift-marketplace/community-operators-vvnjd" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.347738 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcda0354-8784-4a0e-86bc-b06464d136d9-catalog-content\") pod \"community-operators-vvnjd\" (UID: \"bcda0354-8784-4a0e-86bc-b06464d136d9\") " pod="openshift-marketplace/community-operators-vvnjd" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.370441 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhpd8\" (UniqueName: \"kubernetes.io/projected/bcda0354-8784-4a0e-86bc-b06464d136d9-kube-api-access-jhpd8\") pod \"community-operators-vvnjd\" (UID: 
\"bcda0354-8784-4a0e-86bc-b06464d136d9\") " pod="openshift-marketplace/community-operators-vvnjd" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.445082 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vvnjd" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.659451 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqv28" event={"ID":"9e9d5421-37a5-4691-be42-0d69ce5c9150","Type":"ContainerStarted","Data":"a8fcfbcd10303200c9e85490a83bdc5aadfe55177c9dfb405f82ec56944ac9e3"} Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.663685 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rphwc" event={"ID":"1fa77cb7-85eb-4961-b663-dc464f81426b","Type":"ContainerStarted","Data":"c12cec6966549d580f04e4c482105d51f920b50b80c1dadad72fd270b946f86b"} Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.693095 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zqv28" podStartSLOduration=2.302748925 podStartE2EDuration="3.693081694s" podCreationTimestamp="2025-10-04 02:44:42 +0000 UTC" firstStartedPulling="2025-10-04 02:44:43.642924896 +0000 UTC m=+263.539883534" lastFinishedPulling="2025-10-04 02:44:45.033257655 +0000 UTC m=+264.930216303" observedRunningTime="2025-10-04 02:44:45.678459432 +0000 UTC m=+265.575418070" watchObservedRunningTime="2025-10-04 02:44:45.693081694 +0000 UTC m=+265.590040332" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.708413 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7b6j9"] Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.709597 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7b6j9" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.711510 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.719396 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7b6j9"] Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.853381 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wnc2\" (UniqueName: \"kubernetes.io/projected/63e7ee31-bc4a-4f08-bdec-51eb8f16be69-kube-api-access-4wnc2\") pod \"certified-operators-7b6j9\" (UID: \"63e7ee31-bc4a-4f08-bdec-51eb8f16be69\") " pod="openshift-marketplace/certified-operators-7b6j9" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.853740 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e7ee31-bc4a-4f08-bdec-51eb8f16be69-catalog-content\") pod \"certified-operators-7b6j9\" (UID: \"63e7ee31-bc4a-4f08-bdec-51eb8f16be69\") " pod="openshift-marketplace/certified-operators-7b6j9" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.853840 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e7ee31-bc4a-4f08-bdec-51eb8f16be69-utilities\") pod \"certified-operators-7b6j9\" (UID: \"63e7ee31-bc4a-4f08-bdec-51eb8f16be69\") " pod="openshift-marketplace/certified-operators-7b6j9" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.902039 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vvnjd"] Oct 04 02:44:45 crc kubenswrapper[4964]: W1004 02:44:45.914395 4964 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcda0354_8784_4a0e_86bc_b06464d136d9.slice/crio-99b4345f0bc4d2c3c1c653ff5c41c21cb41f45fee0aeae856ef696adfe8f4049 WatchSource:0}: Error finding container 99b4345f0bc4d2c3c1c653ff5c41c21cb41f45fee0aeae856ef696adfe8f4049: Status 404 returned error can't find the container with id 99b4345f0bc4d2c3c1c653ff5c41c21cb41f45fee0aeae856ef696adfe8f4049 Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.955174 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wnc2\" (UniqueName: \"kubernetes.io/projected/63e7ee31-bc4a-4f08-bdec-51eb8f16be69-kube-api-access-4wnc2\") pod \"certified-operators-7b6j9\" (UID: \"63e7ee31-bc4a-4f08-bdec-51eb8f16be69\") " pod="openshift-marketplace/certified-operators-7b6j9" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.955292 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e7ee31-bc4a-4f08-bdec-51eb8f16be69-catalog-content\") pod \"certified-operators-7b6j9\" (UID: \"63e7ee31-bc4a-4f08-bdec-51eb8f16be69\") " pod="openshift-marketplace/certified-operators-7b6j9" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.955368 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e7ee31-bc4a-4f08-bdec-51eb8f16be69-utilities\") pod \"certified-operators-7b6j9\" (UID: \"63e7ee31-bc4a-4f08-bdec-51eb8f16be69\") " pod="openshift-marketplace/certified-operators-7b6j9" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.955884 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e7ee31-bc4a-4f08-bdec-51eb8f16be69-utilities\") pod \"certified-operators-7b6j9\" (UID: \"63e7ee31-bc4a-4f08-bdec-51eb8f16be69\") " pod="openshift-marketplace/certified-operators-7b6j9" Oct 04 02:44:45 crc 
kubenswrapper[4964]: I1004 02:44:45.956092 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e7ee31-bc4a-4f08-bdec-51eb8f16be69-catalog-content\") pod \"certified-operators-7b6j9\" (UID: \"63e7ee31-bc4a-4f08-bdec-51eb8f16be69\") " pod="openshift-marketplace/certified-operators-7b6j9" Oct 04 02:44:45 crc kubenswrapper[4964]: I1004 02:44:45.972241 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wnc2\" (UniqueName: \"kubernetes.io/projected/63e7ee31-bc4a-4f08-bdec-51eb8f16be69-kube-api-access-4wnc2\") pod \"certified-operators-7b6j9\" (UID: \"63e7ee31-bc4a-4f08-bdec-51eb8f16be69\") " pod="openshift-marketplace/certified-operators-7b6j9" Oct 04 02:44:46 crc kubenswrapper[4964]: I1004 02:44:46.023366 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7b6j9" Oct 04 02:44:46 crc kubenswrapper[4964]: I1004 02:44:46.422086 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7b6j9"] Oct 04 02:44:46 crc kubenswrapper[4964]: W1004 02:44:46.429773 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63e7ee31_bc4a_4f08_bdec_51eb8f16be69.slice/crio-2b8d2db6fd5e8cefecbcc751618ae3aa6065f8021a62e9f30b1cdbc4d18738c0 WatchSource:0}: Error finding container 2b8d2db6fd5e8cefecbcc751618ae3aa6065f8021a62e9f30b1cdbc4d18738c0: Status 404 returned error can't find the container with id 2b8d2db6fd5e8cefecbcc751618ae3aa6065f8021a62e9f30b1cdbc4d18738c0 Oct 04 02:44:46 crc kubenswrapper[4964]: I1004 02:44:46.670731 4964 generic.go:334] "Generic (PLEG): container finished" podID="bcda0354-8784-4a0e-86bc-b06464d136d9" containerID="8b38d29dfac1a71962737abe761451885df3c0dc33e8702341ef37680f5b23cb" exitCode=0 Oct 04 02:44:46 crc kubenswrapper[4964]: I1004 02:44:46.670828 4964 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvnjd" event={"ID":"bcda0354-8784-4a0e-86bc-b06464d136d9","Type":"ContainerDied","Data":"8b38d29dfac1a71962737abe761451885df3c0dc33e8702341ef37680f5b23cb"} Oct 04 02:44:46 crc kubenswrapper[4964]: I1004 02:44:46.671135 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvnjd" event={"ID":"bcda0354-8784-4a0e-86bc-b06464d136d9","Type":"ContainerStarted","Data":"99b4345f0bc4d2c3c1c653ff5c41c21cb41f45fee0aeae856ef696adfe8f4049"} Oct 04 02:44:46 crc kubenswrapper[4964]: I1004 02:44:46.673141 4964 generic.go:334] "Generic (PLEG): container finished" podID="1fa77cb7-85eb-4961-b663-dc464f81426b" containerID="c12cec6966549d580f04e4c482105d51f920b50b80c1dadad72fd270b946f86b" exitCode=0 Oct 04 02:44:46 crc kubenswrapper[4964]: I1004 02:44:46.673250 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rphwc" event={"ID":"1fa77cb7-85eb-4961-b663-dc464f81426b","Type":"ContainerDied","Data":"c12cec6966549d580f04e4c482105d51f920b50b80c1dadad72fd270b946f86b"} Oct 04 02:44:46 crc kubenswrapper[4964]: I1004 02:44:46.675919 4964 generic.go:334] "Generic (PLEG): container finished" podID="63e7ee31-bc4a-4f08-bdec-51eb8f16be69" containerID="fa81c489191144b9a334eb630b0d7a83665179ad97c6347cc1af1c792a9e4ff3" exitCode=0 Oct 04 02:44:46 crc kubenswrapper[4964]: I1004 02:44:46.676036 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7b6j9" event={"ID":"63e7ee31-bc4a-4f08-bdec-51eb8f16be69","Type":"ContainerDied","Data":"fa81c489191144b9a334eb630b0d7a83665179ad97c6347cc1af1c792a9e4ff3"} Oct 04 02:44:46 crc kubenswrapper[4964]: I1004 02:44:46.676117 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7b6j9" 
event={"ID":"63e7ee31-bc4a-4f08-bdec-51eb8f16be69","Type":"ContainerStarted","Data":"2b8d2db6fd5e8cefecbcc751618ae3aa6065f8021a62e9f30b1cdbc4d18738c0"} Oct 04 02:44:47 crc kubenswrapper[4964]: I1004 02:44:47.684686 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvnjd" event={"ID":"bcda0354-8784-4a0e-86bc-b06464d136d9","Type":"ContainerStarted","Data":"402d94adbc56f82b4ff846bdc09283faeb426dcaec1806be8df09ec56ba94686"} Oct 04 02:44:47 crc kubenswrapper[4964]: I1004 02:44:47.686998 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rphwc" event={"ID":"1fa77cb7-85eb-4961-b663-dc464f81426b","Type":"ContainerStarted","Data":"0c0f97b21016af2675e9d5e50b6f43d95660669c94927d2004abbc67ec625828"} Oct 04 02:44:47 crc kubenswrapper[4964]: I1004 02:44:47.688597 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7b6j9" event={"ID":"63e7ee31-bc4a-4f08-bdec-51eb8f16be69","Type":"ContainerStarted","Data":"72d21a635c056229db52788e81474255ae33df82587406d34c34df4cde7b0160"} Oct 04 02:44:47 crc kubenswrapper[4964]: I1004 02:44:47.720663 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rphwc" podStartSLOduration=2.20523846 podStartE2EDuration="4.72064771s" podCreationTimestamp="2025-10-04 02:44:43 +0000 UTC" firstStartedPulling="2025-10-04 02:44:44.649537932 +0000 UTC m=+264.546496570" lastFinishedPulling="2025-10-04 02:44:47.164947192 +0000 UTC m=+267.061905820" observedRunningTime="2025-10-04 02:44:47.718252134 +0000 UTC m=+267.615210772" watchObservedRunningTime="2025-10-04 02:44:47.72064771 +0000 UTC m=+267.617606348" Oct 04 02:44:48 crc kubenswrapper[4964]: I1004 02:44:48.695851 4964 generic.go:334] "Generic (PLEG): container finished" podID="63e7ee31-bc4a-4f08-bdec-51eb8f16be69" containerID="72d21a635c056229db52788e81474255ae33df82587406d34c34df4cde7b0160" 
exitCode=0 Oct 04 02:44:48 crc kubenswrapper[4964]: I1004 02:44:48.696086 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7b6j9" event={"ID":"63e7ee31-bc4a-4f08-bdec-51eb8f16be69","Type":"ContainerDied","Data":"72d21a635c056229db52788e81474255ae33df82587406d34c34df4cde7b0160"} Oct 04 02:44:48 crc kubenswrapper[4964]: I1004 02:44:48.699926 4964 generic.go:334] "Generic (PLEG): container finished" podID="bcda0354-8784-4a0e-86bc-b06464d136d9" containerID="402d94adbc56f82b4ff846bdc09283faeb426dcaec1806be8df09ec56ba94686" exitCode=0 Oct 04 02:44:48 crc kubenswrapper[4964]: I1004 02:44:48.699989 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvnjd" event={"ID":"bcda0354-8784-4a0e-86bc-b06464d136d9","Type":"ContainerDied","Data":"402d94adbc56f82b4ff846bdc09283faeb426dcaec1806be8df09ec56ba94686"} Oct 04 02:44:49 crc kubenswrapper[4964]: I1004 02:44:49.707917 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvnjd" event={"ID":"bcda0354-8784-4a0e-86bc-b06464d136d9","Type":"ContainerStarted","Data":"b802fe4f3dffc11e37e07dbbc80cc46449cc6653c72d01ede49567bd289cc6c2"} Oct 04 02:44:49 crc kubenswrapper[4964]: I1004 02:44:49.710458 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7b6j9" event={"ID":"63e7ee31-bc4a-4f08-bdec-51eb8f16be69","Type":"ContainerStarted","Data":"2d576e6dd6d0355946d1b5a21fdc7c1d6ec421ad4552e169a5df2ce11864f5c2"} Oct 04 02:44:49 crc kubenswrapper[4964]: I1004 02:44:49.732121 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vvnjd" podStartSLOduration=2.317402751 podStartE2EDuration="4.732099544s" podCreationTimestamp="2025-10-04 02:44:45 +0000 UTC" firstStartedPulling="2025-10-04 02:44:46.672546714 +0000 UTC m=+266.569505382" lastFinishedPulling="2025-10-04 02:44:49.087243527 
+0000 UTC m=+268.984202175" observedRunningTime="2025-10-04 02:44:49.726346906 +0000 UTC m=+269.623305554" watchObservedRunningTime="2025-10-04 02:44:49.732099544 +0000 UTC m=+269.629058182" Oct 04 02:44:49 crc kubenswrapper[4964]: I1004 02:44:49.747905 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7b6j9" podStartSLOduration=2.275218961 podStartE2EDuration="4.747888437s" podCreationTimestamp="2025-10-04 02:44:45 +0000 UTC" firstStartedPulling="2025-10-04 02:44:46.677545371 +0000 UTC m=+266.574504009" lastFinishedPulling="2025-10-04 02:44:49.150214847 +0000 UTC m=+269.047173485" observedRunningTime="2025-10-04 02:44:49.745649356 +0000 UTC m=+269.642608004" watchObservedRunningTime="2025-10-04 02:44:49.747888437 +0000 UTC m=+269.644847075" Oct 04 02:44:53 crc kubenswrapper[4964]: I1004 02:44:53.034224 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zqv28" Oct 04 02:44:53 crc kubenswrapper[4964]: I1004 02:44:53.034293 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zqv28" Oct 04 02:44:53 crc kubenswrapper[4964]: I1004 02:44:53.075904 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zqv28" Oct 04 02:44:53 crc kubenswrapper[4964]: I1004 02:44:53.655218 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rphwc" Oct 04 02:44:53 crc kubenswrapper[4964]: I1004 02:44:53.655529 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rphwc" Oct 04 02:44:53 crc kubenswrapper[4964]: I1004 02:44:53.738769 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rphwc" Oct 04 02:44:53 crc kubenswrapper[4964]: I1004 
02:44:53.792690 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zqv28" Oct 04 02:44:53 crc kubenswrapper[4964]: I1004 02:44:53.798172 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rphwc" Oct 04 02:44:55 crc kubenswrapper[4964]: I1004 02:44:55.445437 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vvnjd" Oct 04 02:44:55 crc kubenswrapper[4964]: I1004 02:44:55.445493 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vvnjd" Oct 04 02:44:55 crc kubenswrapper[4964]: I1004 02:44:55.511009 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vvnjd" Oct 04 02:44:55 crc kubenswrapper[4964]: I1004 02:44:55.796247 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vvnjd" Oct 04 02:44:56 crc kubenswrapper[4964]: I1004 02:44:56.023510 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7b6j9" Oct 04 02:44:56 crc kubenswrapper[4964]: I1004 02:44:56.023564 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7b6j9" Oct 04 02:44:56 crc kubenswrapper[4964]: I1004 02:44:56.065062 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7b6j9" Oct 04 02:44:56 crc kubenswrapper[4964]: I1004 02:44:56.794590 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7b6j9" Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 02:45:00.122900 4964 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67"] Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 02:45:00.123550 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 02:45:00.125292 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 02:45:00.127403 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 02:45:00.129211 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67"] Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 02:45:00.257597 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-secret-volume\") pod \"collect-profiles-29325765-t4h67\" (UID: \"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 02:45:00.257733 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-config-volume\") pod \"collect-profiles-29325765-t4h67\" (UID: \"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 02:45:00.257758 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcdr5\" (UniqueName: 
\"kubernetes.io/projected/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-kube-api-access-rcdr5\") pod \"collect-profiles-29325765-t4h67\" (UID: \"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 02:45:00.358691 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-config-volume\") pod \"collect-profiles-29325765-t4h67\" (UID: \"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 02:45:00.358736 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcdr5\" (UniqueName: \"kubernetes.io/projected/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-kube-api-access-rcdr5\") pod \"collect-profiles-29325765-t4h67\" (UID: \"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 02:45:00.358782 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-secret-volume\") pod \"collect-profiles-29325765-t4h67\" (UID: \"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 02:45:00.359837 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-config-volume\") pod \"collect-profiles-29325765-t4h67\" (UID: \"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 
02:45:00.368487 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-secret-volume\") pod \"collect-profiles-29325765-t4h67\" (UID: \"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 02:45:00.387934 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcdr5\" (UniqueName: \"kubernetes.io/projected/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-kube-api-access-rcdr5\") pod \"collect-profiles-29325765-t4h67\" (UID: \"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 02:45:00.438159 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" Oct 04 02:45:00 crc kubenswrapper[4964]: I1004 02:45:00.880673 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67"] Oct 04 02:45:01 crc kubenswrapper[4964]: I1004 02:45:01.780837 4964 generic.go:334] "Generic (PLEG): container finished" podID="5c9243f1-7ba9-4c8d-bcc8-f90c3104852b" containerID="385ada5cd8db23138e0663e82a5fdadb43dff2f7b9bc0d0bc4106a7cda2d7ceb" exitCode=0 Oct 04 02:45:01 crc kubenswrapper[4964]: I1004 02:45:01.780902 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" event={"ID":"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b","Type":"ContainerDied","Data":"385ada5cd8db23138e0663e82a5fdadb43dff2f7b9bc0d0bc4106a7cda2d7ceb"} Oct 04 02:45:01 crc kubenswrapper[4964]: I1004 02:45:01.781205 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" 
event={"ID":"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b","Type":"ContainerStarted","Data":"d93560df3c53f4be80843811765fa8facd9515d1c4df556fa95f4da5dd56ed62"} Oct 04 02:45:03 crc kubenswrapper[4964]: I1004 02:45:03.145302 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" Oct 04 02:45:03 crc kubenswrapper[4964]: I1004 02:45:03.306350 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcdr5\" (UniqueName: \"kubernetes.io/projected/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-kube-api-access-rcdr5\") pod \"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b\" (UID: \"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b\") " Oct 04 02:45:03 crc kubenswrapper[4964]: I1004 02:45:03.306511 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-config-volume\") pod \"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b\" (UID: \"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b\") " Oct 04 02:45:03 crc kubenswrapper[4964]: I1004 02:45:03.306561 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-secret-volume\") pod \"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b\" (UID: \"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b\") " Oct 04 02:45:03 crc kubenswrapper[4964]: I1004 02:45:03.307829 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-config-volume" (OuterVolumeSpecName: "config-volume") pod "5c9243f1-7ba9-4c8d-bcc8-f90c3104852b" (UID: "5c9243f1-7ba9-4c8d-bcc8-f90c3104852b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:45:03 crc kubenswrapper[4964]: I1004 02:45:03.312528 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-kube-api-access-rcdr5" (OuterVolumeSpecName: "kube-api-access-rcdr5") pod "5c9243f1-7ba9-4c8d-bcc8-f90c3104852b" (UID: "5c9243f1-7ba9-4c8d-bcc8-f90c3104852b"). InnerVolumeSpecName "kube-api-access-rcdr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:45:03 crc kubenswrapper[4964]: I1004 02:45:03.313144 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5c9243f1-7ba9-4c8d-bcc8-f90c3104852b" (UID: "5c9243f1-7ba9-4c8d-bcc8-f90c3104852b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:45:03 crc kubenswrapper[4964]: I1004 02:45:03.407598 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcdr5\" (UniqueName: \"kubernetes.io/projected/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-kube-api-access-rcdr5\") on node \"crc\" DevicePath \"\"" Oct 04 02:45:03 crc kubenswrapper[4964]: I1004 02:45:03.407671 4964 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 04 02:45:03 crc kubenswrapper[4964]: I1004 02:45:03.407680 4964 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 04 02:45:03 crc kubenswrapper[4964]: I1004 02:45:03.797959 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" Oct 04 02:45:03 crc kubenswrapper[4964]: I1004 02:45:03.797895 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67" event={"ID":"5c9243f1-7ba9-4c8d-bcc8-f90c3104852b","Type":"ContainerDied","Data":"d93560df3c53f4be80843811765fa8facd9515d1c4df556fa95f4da5dd56ed62"} Oct 04 02:45:03 crc kubenswrapper[4964]: I1004 02:45:03.798520 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d93560df3c53f4be80843811765fa8facd9515d1c4df556fa95f4da5dd56ed62" Oct 04 02:46:34 crc kubenswrapper[4964]: I1004 02:46:34.449026 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:46:34 crc kubenswrapper[4964]: I1004 02:46:34.449838 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:47:04 crc kubenswrapper[4964]: I1004 02:47:04.448576 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:47:04 crc kubenswrapper[4964]: I1004 02:47:04.449219 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.726673 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ts46m"] Oct 04 02:47:23 crc kubenswrapper[4964]: E1004 02:47:23.727559 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9243f1-7ba9-4c8d-bcc8-f90c3104852b" containerName="collect-profiles" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.727580 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9243f1-7ba9-4c8d-bcc8-f90c3104852b" containerName="collect-profiles" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.727815 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c9243f1-7ba9-4c8d-bcc8-f90c3104852b" containerName="collect-profiles" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.728275 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.755108 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ts46m"] Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.824986 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.825301 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mlzw\" (UniqueName: \"kubernetes.io/projected/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-kube-api-access-7mlzw\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.825328 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-trusted-ca\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.825354 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.825385 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-bound-sa-token\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.825419 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.825444 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-registry-certificates\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.825478 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-registry-tls\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.853931 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.926971 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.927026 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-registry-certificates\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.927072 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-registry-tls\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.927128 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mlzw\" (UniqueName: \"kubernetes.io/projected/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-kube-api-access-7mlzw\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 
02:47:23.927148 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-trusted-ca\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.927177 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.927210 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-bound-sa-token\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.928280 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.929660 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-registry-certificates\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.929958 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-trusted-ca\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.935108 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.935595 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-registry-tls\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.945851 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-bound-sa-token\") pod \"image-registry-66df7c8f76-ts46m\" (UID: \"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:23 crc kubenswrapper[4964]: I1004 02:47:23.950860 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mlzw\" (UniqueName: \"kubernetes.io/projected/99f8e2d0-e43c-4a2a-9a4d-4382496ed1af-kube-api-access-7mlzw\") pod \"image-registry-66df7c8f76-ts46m\" (UID: 
\"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af\") " pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:24 crc kubenswrapper[4964]: I1004 02:47:24.047969 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:24 crc kubenswrapper[4964]: I1004 02:47:24.327733 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ts46m"] Oct 04 02:47:24 crc kubenswrapper[4964]: W1004 02:47:24.334009 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99f8e2d0_e43c_4a2a_9a4d_4382496ed1af.slice/crio-aff878a2edb37f83113f1269defb36598e18d3a39ac5d6307cf8d77dd45152ff WatchSource:0}: Error finding container aff878a2edb37f83113f1269defb36598e18d3a39ac5d6307cf8d77dd45152ff: Status 404 returned error can't find the container with id aff878a2edb37f83113f1269defb36598e18d3a39ac5d6307cf8d77dd45152ff Oct 04 02:47:24 crc kubenswrapper[4964]: I1004 02:47:24.694997 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" event={"ID":"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af","Type":"ContainerStarted","Data":"858605ade4e21464cf6a633185364f7b5a6105f750356d20d9ccf9f22810ca7b"} Oct 04 02:47:24 crc kubenswrapper[4964]: I1004 02:47:24.695078 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" event={"ID":"99f8e2d0-e43c-4a2a-9a4d-4382496ed1af","Type":"ContainerStarted","Data":"aff878a2edb37f83113f1269defb36598e18d3a39ac5d6307cf8d77dd45152ff"} Oct 04 02:47:24 crc kubenswrapper[4964]: I1004 02:47:24.695246 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:24 crc kubenswrapper[4964]: I1004 02:47:24.722688 4964 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" podStartSLOduration=1.722661306 podStartE2EDuration="1.722661306s" podCreationTimestamp="2025-10-04 02:47:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:47:24.720416303 +0000 UTC m=+424.617374971" watchObservedRunningTime="2025-10-04 02:47:24.722661306 +0000 UTC m=+424.619619974" Oct 04 02:47:34 crc kubenswrapper[4964]: I1004 02:47:34.449356 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:47:34 crc kubenswrapper[4964]: I1004 02:47:34.450112 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:47:34 crc kubenswrapper[4964]: I1004 02:47:34.450211 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:47:34 crc kubenswrapper[4964]: I1004 02:47:34.451127 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ca513d76044d84d58871dc80cf9d1dc2e3bdff83478b7916d3faa2f268b14909"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 02:47:34 crc kubenswrapper[4964]: I1004 02:47:34.451225 4964 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://ca513d76044d84d58871dc80cf9d1dc2e3bdff83478b7916d3faa2f268b14909" gracePeriod=600 Oct 04 02:47:34 crc kubenswrapper[4964]: I1004 02:47:34.780864 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="ca513d76044d84d58871dc80cf9d1dc2e3bdff83478b7916d3faa2f268b14909" exitCode=0 Oct 04 02:47:34 crc kubenswrapper[4964]: I1004 02:47:34.780991 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"ca513d76044d84d58871dc80cf9d1dc2e3bdff83478b7916d3faa2f268b14909"} Oct 04 02:47:34 crc kubenswrapper[4964]: I1004 02:47:34.781425 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"1137f47c0aa6462aae7846003655bf95319654803dc6e0eaf3afe56be26f6389"} Oct 04 02:47:34 crc kubenswrapper[4964]: I1004 02:47:34.781465 4964 scope.go:117] "RemoveContainer" containerID="6a8dddba9e070bd6be4c1856646edbaed81db96c4c671f5fabe8330743fe4387" Oct 04 02:47:44 crc kubenswrapper[4964]: I1004 02:47:44.055268 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-ts46m" Oct 04 02:47:44 crc kubenswrapper[4964]: I1004 02:47:44.126129 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wppjk"] Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.178330 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" podUID="15ca5de7-b5aa-4d86-82b5-122f22b494ee" 
containerName="registry" containerID="cri-o://391178478c5a5a68d3a0ccf737524e75d080cf0cea853142ff842883deeb3fa6" gracePeriod=30 Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.621716 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.697951 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15ca5de7-b5aa-4d86-82b5-122f22b494ee-installation-pull-secrets\") pod \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.698051 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15ca5de7-b5aa-4d86-82b5-122f22b494ee-trusted-ca\") pod \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.698149 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5vh7\" (UniqueName: \"kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-kube-api-access-d5vh7\") pod \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.698310 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15ca5de7-b5aa-4d86-82b5-122f22b494ee-ca-trust-extracted\") pod \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.698383 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/15ca5de7-b5aa-4d86-82b5-122f22b494ee-registry-certificates\") pod \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.698441 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-bound-sa-token\") pod \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.698491 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-registry-tls\") pod \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.698760 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\" (UID: \"15ca5de7-b5aa-4d86-82b5-122f22b494ee\") " Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.699654 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ca5de7-b5aa-4d86-82b5-122f22b494ee-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "15ca5de7-b5aa-4d86-82b5-122f22b494ee" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.700213 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ca5de7-b5aa-4d86-82b5-122f22b494ee-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "15ca5de7-b5aa-4d86-82b5-122f22b494ee" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.708170 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "15ca5de7-b5aa-4d86-82b5-122f22b494ee" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.708740 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-kube-api-access-d5vh7" (OuterVolumeSpecName: "kube-api-access-d5vh7") pod "15ca5de7-b5aa-4d86-82b5-122f22b494ee" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee"). InnerVolumeSpecName "kube-api-access-d5vh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.711108 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15ca5de7-b5aa-4d86-82b5-122f22b494ee-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "15ca5de7-b5aa-4d86-82b5-122f22b494ee" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.714136 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "15ca5de7-b5aa-4d86-82b5-122f22b494ee" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.719923 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "15ca5de7-b5aa-4d86-82b5-122f22b494ee" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.723229 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15ca5de7-b5aa-4d86-82b5-122f22b494ee-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "15ca5de7-b5aa-4d86-82b5-122f22b494ee" (UID: "15ca5de7-b5aa-4d86-82b5-122f22b494ee"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.800932 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5vh7\" (UniqueName: \"kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-kube-api-access-d5vh7\") on node \"crc\" DevicePath \"\"" Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.800974 4964 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15ca5de7-b5aa-4d86-82b5-122f22b494ee-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.800990 4964 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15ca5de7-b5aa-4d86-82b5-122f22b494ee-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.801004 4964 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.801018 4964 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15ca5de7-b5aa-4d86-82b5-122f22b494ee-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.801033 4964 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15ca5de7-b5aa-4d86-82b5-122f22b494ee-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 04 02:48:09 crc kubenswrapper[4964]: I1004 02:48:09.801046 4964 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15ca5de7-b5aa-4d86-82b5-122f22b494ee-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:48:10 crc 
kubenswrapper[4964]: I1004 02:48:10.062562 4964 generic.go:334] "Generic (PLEG): container finished" podID="15ca5de7-b5aa-4d86-82b5-122f22b494ee" containerID="391178478c5a5a68d3a0ccf737524e75d080cf0cea853142ff842883deeb3fa6" exitCode=0 Oct 04 02:48:10 crc kubenswrapper[4964]: I1004 02:48:10.062675 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" event={"ID":"15ca5de7-b5aa-4d86-82b5-122f22b494ee","Type":"ContainerDied","Data":"391178478c5a5a68d3a0ccf737524e75d080cf0cea853142ff842883deeb3fa6"} Oct 04 02:48:10 crc kubenswrapper[4964]: I1004 02:48:10.062688 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" Oct 04 02:48:10 crc kubenswrapper[4964]: I1004 02:48:10.062731 4964 scope.go:117] "RemoveContainer" containerID="391178478c5a5a68d3a0ccf737524e75d080cf0cea853142ff842883deeb3fa6" Oct 04 02:48:10 crc kubenswrapper[4964]: I1004 02:48:10.062713 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wppjk" event={"ID":"15ca5de7-b5aa-4d86-82b5-122f22b494ee","Type":"ContainerDied","Data":"197680c3c569be03412cf4593f84d7170d199e46ce7d6517d9503cabd51933eb"} Oct 04 02:48:10 crc kubenswrapper[4964]: I1004 02:48:10.092272 4964 scope.go:117] "RemoveContainer" containerID="391178478c5a5a68d3a0ccf737524e75d080cf0cea853142ff842883deeb3fa6" Oct 04 02:48:10 crc kubenswrapper[4964]: E1004 02:48:10.092998 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"391178478c5a5a68d3a0ccf737524e75d080cf0cea853142ff842883deeb3fa6\": container with ID starting with 391178478c5a5a68d3a0ccf737524e75d080cf0cea853142ff842883deeb3fa6 not found: ID does not exist" containerID="391178478c5a5a68d3a0ccf737524e75d080cf0cea853142ff842883deeb3fa6" Oct 04 02:48:10 crc kubenswrapper[4964]: I1004 02:48:10.093056 4964 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"391178478c5a5a68d3a0ccf737524e75d080cf0cea853142ff842883deeb3fa6"} err="failed to get container status \"391178478c5a5a68d3a0ccf737524e75d080cf0cea853142ff842883deeb3fa6\": rpc error: code = NotFound desc = could not find container \"391178478c5a5a68d3a0ccf737524e75d080cf0cea853142ff842883deeb3fa6\": container with ID starting with 391178478c5a5a68d3a0ccf737524e75d080cf0cea853142ff842883deeb3fa6 not found: ID does not exist" Oct 04 02:48:10 crc kubenswrapper[4964]: I1004 02:48:10.111233 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wppjk"] Oct 04 02:48:10 crc kubenswrapper[4964]: I1004 02:48:10.117872 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wppjk"] Oct 04 02:48:10 crc kubenswrapper[4964]: I1004 02:48:10.855973 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15ca5de7-b5aa-4d86-82b5-122f22b494ee" path="/var/lib/kubelet/pods/15ca5de7-b5aa-4d86-82b5-122f22b494ee/volumes" Oct 04 02:49:34 crc kubenswrapper[4964]: I1004 02:49:34.449744 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:49:34 crc kubenswrapper[4964]: I1004 02:49:34.450327 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.698757 4964 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-cainjector-7f985d654d-zlb46"] Oct 04 02:49:54 crc kubenswrapper[4964]: E1004 02:49:54.699708 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ca5de7-b5aa-4d86-82b5-122f22b494ee" containerName="registry" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.699721 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ca5de7-b5aa-4d86-82b5-122f22b494ee" containerName="registry" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.699806 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="15ca5de7-b5aa-4d86-82b5-122f22b494ee" containerName="registry" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.700134 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-zlb46" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.702523 4964 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-sndrs" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.704569 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.705269 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.712649 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vkqh4"] Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.713437 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-vkqh4" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.714977 4964 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-bvckr" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.725709 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-dksdz"] Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.726405 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-dksdz" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.727692 4964 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-5zkmv" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.733132 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vkqh4"] Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.754301 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-dksdz"] Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.756858 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-zlb46"] Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.866281 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmzpl\" (UniqueName: \"kubernetes.io/projected/921b427a-cfcd-4125-aab7-e1c073058743-kube-api-access-lmzpl\") pod \"cert-manager-webhook-5655c58dd6-dksdz\" (UID: \"921b427a-cfcd-4125-aab7-e1c073058743\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-dksdz" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.866357 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx75c\" (UniqueName: 
\"kubernetes.io/projected/454987ea-9834-46bc-b79a-ba124a2a44ed-kube-api-access-rx75c\") pod \"cert-manager-cainjector-7f985d654d-zlb46\" (UID: \"454987ea-9834-46bc-b79a-ba124a2a44ed\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-zlb46" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.866488 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz4lv\" (UniqueName: \"kubernetes.io/projected/94a658f9-597a-4377-a7a2-b5b46e1fe345-kube-api-access-cz4lv\") pod \"cert-manager-5b446d88c5-vkqh4\" (UID: \"94a658f9-597a-4377-a7a2-b5b46e1fe345\") " pod="cert-manager/cert-manager-5b446d88c5-vkqh4" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.967501 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmzpl\" (UniqueName: \"kubernetes.io/projected/921b427a-cfcd-4125-aab7-e1c073058743-kube-api-access-lmzpl\") pod \"cert-manager-webhook-5655c58dd6-dksdz\" (UID: \"921b427a-cfcd-4125-aab7-e1c073058743\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-dksdz" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.967821 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx75c\" (UniqueName: \"kubernetes.io/projected/454987ea-9834-46bc-b79a-ba124a2a44ed-kube-api-access-rx75c\") pod \"cert-manager-cainjector-7f985d654d-zlb46\" (UID: \"454987ea-9834-46bc-b79a-ba124a2a44ed\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-zlb46" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.967880 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz4lv\" (UniqueName: \"kubernetes.io/projected/94a658f9-597a-4377-a7a2-b5b46e1fe345-kube-api-access-cz4lv\") pod \"cert-manager-5b446d88c5-vkqh4\" (UID: \"94a658f9-597a-4377-a7a2-b5b46e1fe345\") " pod="cert-manager/cert-manager-5b446d88c5-vkqh4" Oct 04 02:49:54 crc kubenswrapper[4964]: I1004 02:49:54.990212 
4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmzpl\" (UniqueName: \"kubernetes.io/projected/921b427a-cfcd-4125-aab7-e1c073058743-kube-api-access-lmzpl\") pod \"cert-manager-webhook-5655c58dd6-dksdz\" (UID: \"921b427a-cfcd-4125-aab7-e1c073058743\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-dksdz" Oct 04 02:49:55 crc kubenswrapper[4964]: I1004 02:49:55.010642 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz4lv\" (UniqueName: \"kubernetes.io/projected/94a658f9-597a-4377-a7a2-b5b46e1fe345-kube-api-access-cz4lv\") pod \"cert-manager-5b446d88c5-vkqh4\" (UID: \"94a658f9-597a-4377-a7a2-b5b46e1fe345\") " pod="cert-manager/cert-manager-5b446d88c5-vkqh4" Oct 04 02:49:55 crc kubenswrapper[4964]: I1004 02:49:55.010806 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx75c\" (UniqueName: \"kubernetes.io/projected/454987ea-9834-46bc-b79a-ba124a2a44ed-kube-api-access-rx75c\") pod \"cert-manager-cainjector-7f985d654d-zlb46\" (UID: \"454987ea-9834-46bc-b79a-ba124a2a44ed\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-zlb46" Oct 04 02:49:55 crc kubenswrapper[4964]: I1004 02:49:55.026229 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-zlb46" Oct 04 02:49:55 crc kubenswrapper[4964]: I1004 02:49:55.059914 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-vkqh4" Oct 04 02:49:55 crc kubenswrapper[4964]: I1004 02:49:55.073717 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-dksdz" Oct 04 02:49:55 crc kubenswrapper[4964]: I1004 02:49:55.260739 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-zlb46"] Oct 04 02:49:55 crc kubenswrapper[4964]: I1004 02:49:55.270970 4964 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 02:49:55 crc kubenswrapper[4964]: I1004 02:49:55.526570 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-dksdz"] Oct 04 02:49:55 crc kubenswrapper[4964]: I1004 02:49:55.534574 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vkqh4"] Oct 04 02:49:55 crc kubenswrapper[4964]: W1004 02:49:55.537516 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod921b427a_cfcd_4125_aab7_e1c073058743.slice/crio-e6bbc10aff69bb7a50bc55d6de1a432f21481366235546181d3b0f906e54224d WatchSource:0}: Error finding container e6bbc10aff69bb7a50bc55d6de1a432f21481366235546181d3b0f906e54224d: Status 404 returned error can't find the container with id e6bbc10aff69bb7a50bc55d6de1a432f21481366235546181d3b0f906e54224d Oct 04 02:49:55 crc kubenswrapper[4964]: W1004 02:49:55.553609 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94a658f9_597a_4377_a7a2_b5b46e1fe345.slice/crio-99976141cb44c2672bce729eec343c39f396b875973dfac2b618dfef3f62e99c WatchSource:0}: Error finding container 99976141cb44c2672bce729eec343c39f396b875973dfac2b618dfef3f62e99c: Status 404 returned error can't find the container with id 99976141cb44c2672bce729eec343c39f396b875973dfac2b618dfef3f62e99c Oct 04 02:49:55 crc kubenswrapper[4964]: I1004 02:49:55.707028 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-5655c58dd6-dksdz" event={"ID":"921b427a-cfcd-4125-aab7-e1c073058743","Type":"ContainerStarted","Data":"e6bbc10aff69bb7a50bc55d6de1a432f21481366235546181d3b0f906e54224d"} Oct 04 02:49:55 crc kubenswrapper[4964]: I1004 02:49:55.708390 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-zlb46" event={"ID":"454987ea-9834-46bc-b79a-ba124a2a44ed","Type":"ContainerStarted","Data":"27a34eab8caa1fc0750f938dce476911808c0c6dbf3d65b59c2db257e1aa050a"} Oct 04 02:49:55 crc kubenswrapper[4964]: I1004 02:49:55.709894 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-vkqh4" event={"ID":"94a658f9-597a-4377-a7a2-b5b46e1fe345","Type":"ContainerStarted","Data":"99976141cb44c2672bce729eec343c39f396b875973dfac2b618dfef3f62e99c"} Oct 04 02:49:57 crc kubenswrapper[4964]: I1004 02:49:57.723036 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-zlb46" event={"ID":"454987ea-9834-46bc-b79a-ba124a2a44ed","Type":"ContainerStarted","Data":"caad44fa54015426f7c15e0de367d884918e7e54ac577b58362f6002cf3f21f3"} Oct 04 02:49:57 crc kubenswrapper[4964]: I1004 02:49:57.738041 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-zlb46" podStartSLOduration=1.862945458 podStartE2EDuration="3.738025662s" podCreationTimestamp="2025-10-04 02:49:54 +0000 UTC" firstStartedPulling="2025-10-04 02:49:55.270744195 +0000 UTC m=+575.167702833" lastFinishedPulling="2025-10-04 02:49:57.145824369 +0000 UTC m=+577.042783037" observedRunningTime="2025-10-04 02:49:57.734973889 +0000 UTC m=+577.631932537" watchObservedRunningTime="2025-10-04 02:49:57.738025662 +0000 UTC m=+577.634984300" Oct 04 02:49:59 crc kubenswrapper[4964]: I1004 02:49:59.747869 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-vkqh4" 
event={"ID":"94a658f9-597a-4377-a7a2-b5b46e1fe345","Type":"ContainerStarted","Data":"06d1026384c40b2fa351a965695a615f66e3dd477416f94c177d52185f0db20b"} Oct 04 02:49:59 crc kubenswrapper[4964]: I1004 02:49:59.754000 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-dksdz" event={"ID":"921b427a-cfcd-4125-aab7-e1c073058743","Type":"ContainerStarted","Data":"fccb201016d07c6f7f52dfdc71c4224d6b7d89e3aa9e2eb77ecf8da28214398e"} Oct 04 02:49:59 crc kubenswrapper[4964]: I1004 02:49:59.754132 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-dksdz" Oct 04 02:49:59 crc kubenswrapper[4964]: I1004 02:49:59.773503 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-vkqh4" podStartSLOduration=2.479642572 podStartE2EDuration="5.773471212s" podCreationTimestamp="2025-10-04 02:49:54 +0000 UTC" firstStartedPulling="2025-10-04 02:49:55.559971424 +0000 UTC m=+575.456930072" lastFinishedPulling="2025-10-04 02:49:58.853800074 +0000 UTC m=+578.750758712" observedRunningTime="2025-10-04 02:49:59.767126119 +0000 UTC m=+579.664084817" watchObservedRunningTime="2025-10-04 02:49:59.773471212 +0000 UTC m=+579.670429890" Oct 04 02:49:59 crc kubenswrapper[4964]: I1004 02:49:59.803803 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-dksdz" podStartSLOduration=2.501081999 podStartE2EDuration="5.803773961s" podCreationTimestamp="2025-10-04 02:49:54 +0000 UTC" firstStartedPulling="2025-10-04 02:49:55.541769566 +0000 UTC m=+575.438728214" lastFinishedPulling="2025-10-04 02:49:58.844461548 +0000 UTC m=+578.741420176" observedRunningTime="2025-10-04 02:49:59.793126399 +0000 UTC m=+579.690085087" watchObservedRunningTime="2025-10-04 02:49:59.803773961 +0000 UTC m=+579.700732649" Oct 04 02:50:04 crc kubenswrapper[4964]: I1004 02:50:04.448989 4964 
patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:50:04 crc kubenswrapper[4964]: I1004 02:50:04.449249 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.079509 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-dksdz" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.557268 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xrs78"] Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.557970 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovn-controller" containerID="cri-o://1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f" gracePeriod=30 Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.558054 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="nbdb" containerID="cri-o://45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032" gracePeriod=30 Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.558189 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" 
containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54" gracePeriod=30 Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.558246 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovn-acl-logging" containerID="cri-o://16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede" gracePeriod=30 Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.558232 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="northd" containerID="cri-o://d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16" gracePeriod=30 Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.558213 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="kube-rbac-proxy-node" containerID="cri-o://4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91" gracePeriod=30 Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.558525 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="sbdb" containerID="cri-o://fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f" gracePeriod=30 Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.614173 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" containerID="cri-o://34418f62ccb97659c09eb2365f742b6446c010e8e64820cc506fd20baa60f0e0" gracePeriod=30 Oct 04 02:50:05 crc 
kubenswrapper[4964]: I1004 02:50:05.797569 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovnkube-controller/3.log" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.801734 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovn-acl-logging/0.log" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802165 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovn-controller/0.log" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802466 4964 generic.go:334] "Generic (PLEG): container finished" podID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerID="34418f62ccb97659c09eb2365f742b6446c010e8e64820cc506fd20baa60f0e0" exitCode=0 Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802484 4964 generic.go:334] "Generic (PLEG): container finished" podID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerID="fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f" exitCode=0 Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802492 4964 generic.go:334] "Generic (PLEG): container finished" podID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerID="45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032" exitCode=0 Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802501 4964 generic.go:334] "Generic (PLEG): container finished" podID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerID="d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16" exitCode=0 Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802509 4964 generic.go:334] "Generic (PLEG): container finished" podID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerID="00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54" exitCode=0 Oct 04 02:50:05 crc kubenswrapper[4964]: 
I1004 02:50:05.802516 4964 generic.go:334] "Generic (PLEG): container finished" podID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerID="4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91" exitCode=0 Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802522 4964 generic.go:334] "Generic (PLEG): container finished" podID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerID="16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede" exitCode=143 Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802528 4964 generic.go:334] "Generic (PLEG): container finished" podID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerID="1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f" exitCode=143 Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802546 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerDied","Data":"34418f62ccb97659c09eb2365f742b6446c010e8e64820cc506fd20baa60f0e0"} Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802578 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerDied","Data":"fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f"} Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802588 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerDied","Data":"45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032"} Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802596 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerDied","Data":"d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16"} 
Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802604 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerDied","Data":"00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54"} Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802630 4964 scope.go:117] "RemoveContainer" containerID="fdd2f5d8f5352a8a1b89430e85047fec2026f18c83435ceb6d9246fb3df2dc9b" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802630 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerDied","Data":"4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91"} Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802702 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerDied","Data":"16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede"} Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.802714 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerDied","Data":"1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f"} Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.807339 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q6hm8_10ea848d-0322-476d-976d-4ae3ac39910b/kube-multus/2.log" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.807751 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q6hm8_10ea848d-0322-476d-976d-4ae3ac39910b/kube-multus/1.log" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.807782 4964 generic.go:334] "Generic (PLEG): container finished" 
podID="10ea848d-0322-476d-976d-4ae3ac39910b" containerID="b667cd3fd52cb198a428e96a085b12b34b610116ecc8aab4a77964917a4d4c6c" exitCode=2 Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.807802 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q6hm8" event={"ID":"10ea848d-0322-476d-976d-4ae3ac39910b","Type":"ContainerDied","Data":"b667cd3fd52cb198a428e96a085b12b34b610116ecc8aab4a77964917a4d4c6c"} Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.808141 4964 scope.go:117] "RemoveContainer" containerID="b667cd3fd52cb198a428e96a085b12b34b610116ecc8aab4a77964917a4d4c6c" Oct 04 02:50:05 crc kubenswrapper[4964]: E1004 02:50:05.808357 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-q6hm8_openshift-multus(10ea848d-0322-476d-976d-4ae3ac39910b)\"" pod="openshift-multus/multus-q6hm8" podUID="10ea848d-0322-476d-976d-4ae3ac39910b" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.882248 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovn-acl-logging/0.log" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.882981 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovn-controller/0.log" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.883508 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.885610 4964 scope.go:117] "RemoveContainer" containerID="a79e79fc7fe0ec6d21d20e93ea04d085d1d23ee8bf1a2c766f100bf1d53b804d" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.933597 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l5jk4"] Oct 04 02:50:05 crc kubenswrapper[4964]: E1004 02:50:05.933956 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="kubecfg-setup" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.934013 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="kubecfg-setup" Oct 04 02:50:05 crc kubenswrapper[4964]: E1004 02:50:05.934081 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="sbdb" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.934134 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="sbdb" Oct 04 02:50:05 crc kubenswrapper[4964]: E1004 02:50:05.934184 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.934231 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: E1004 02:50:05.934280 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="kube-rbac-proxy-ovn-metrics" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.934329 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" 
containerName="kube-rbac-proxy-ovn-metrics" Oct 04 02:50:05 crc kubenswrapper[4964]: E1004 02:50:05.934377 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="kube-rbac-proxy-node" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.934427 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="kube-rbac-proxy-node" Oct 04 02:50:05 crc kubenswrapper[4964]: E1004 02:50:05.934486 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="northd" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.934533 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="northd" Oct 04 02:50:05 crc kubenswrapper[4964]: E1004 02:50:05.934636 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.934693 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: E1004 02:50:05.934744 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.934794 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: E1004 02:50:05.934838 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovn-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.934884 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" 
containerName="ovn-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: E1004 02:50:05.934936 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.934988 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: E1004 02:50:05.935036 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="nbdb" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.935083 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="nbdb" Oct 04 02:50:05 crc kubenswrapper[4964]: E1004 02:50:05.935133 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovn-acl-logging" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.935180 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovn-acl-logging" Oct 04 02:50:05 crc kubenswrapper[4964]: E1004 02:50:05.935241 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.935290 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.935421 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovn-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.935478 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" 
containerName="sbdb" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.935534 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.935580 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="kube-rbac-proxy-ovn-metrics" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.935640 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovn-acl-logging" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.935696 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.935746 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="kube-rbac-proxy-node" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.935792 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.935839 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="nbdb" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.935896 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.935945 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="northd" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.936148 4964 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" containerName="ovnkube-controller" Oct 04 02:50:05 crc kubenswrapper[4964]: I1004 02:50:05.937876 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008438 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-cni-netd\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008480 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-slash\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008503 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-run-netns\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008520 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-openvswitch\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008533 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-kubelet\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: 
\"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008552 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-etc-openvswitch\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008568 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-var-lib-openvswitch\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008602 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-systemd-units\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008631 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-log-socket\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008683 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-log-socket" (OuterVolumeSpecName: "log-socket") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008711 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008728 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-slash" (OuterVolumeSpecName: "host-slash") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008745 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008762 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008777 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008770 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovn-node-metrics-cert\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008792 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008821 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008804 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.008950 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-systemd\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.009513 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqkx7\" (UniqueName: \"kubernetes.io/projected/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-kube-api-access-mqkx7\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.009571 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-node-log\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.009637 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovnkube-script-lib\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.009698 
4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovnkube-config\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.009743 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-ovn\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.009795 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-env-overrides\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.009833 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.009884 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-cni-bin\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.009912 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-run-ovn-kubernetes\") pod \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\" (UID: \"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9\") " Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010086 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010127 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010247 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-run-openvswitch\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010232 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010253 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010311 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010354 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-node-log" (OuterVolumeSpecName: "node-log") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010379 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-slash\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010470 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-systemd-units\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010495 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-ovnkube-script-lib\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010533 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-run-systemd\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010579 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxs7s\" (UniqueName: \"kubernetes.io/projected/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-kube-api-access-pxs7s\") pod \"ovnkube-node-l5jk4\" (UID: 
\"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010599 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-env-overrides\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010639 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010661 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-etc-openvswitch\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010679 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-ovn-node-metrics-cert\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010711 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-cni-netd\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010730 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-run-ovn\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010766 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-run-netns\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010764 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010771 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010824 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-log-socket\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010950 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-ovnkube-config\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.010959 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011087 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-kubelet\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011138 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-cni-bin\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011270 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-node-log\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011314 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-var-lib-openvswitch\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011370 4964 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011390 4964 
reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011405 4964 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011416 4964 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-slash\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011427 4964 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011439 4964 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011451 4964 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011462 4964 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011473 4964 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011485 4964 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011496 4964 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-log-socket\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011508 4964 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-node-log\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011519 4964 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011530 4964 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011541 4964 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.011552 4964 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc 
kubenswrapper[4964]: I1004 02:50:06.011564 4964 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.014411 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.015295 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-kube-api-access-mqkx7" (OuterVolumeSpecName: "kube-api-access-mqkx7") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "kube-api-access-mqkx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.021321 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" (UID: "74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.112335 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-log-socket\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.112699 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-ovnkube-config\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.112870 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-cni-bin\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.112969 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-cni-bin\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.112538 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-log-socket\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.113208 
4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-kubelet\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.113269 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-kubelet\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.113368 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-node-log\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.113590 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-node-log\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.113782 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-var-lib-openvswitch\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.113661 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-var-lib-openvswitch\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.113818 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-ovnkube-config\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.113980 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114111 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-run-openvswitch\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114152 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-slash\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114064 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114195 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-run-openvswitch\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114190 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-systemd-units\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114263 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-slash\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114275 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-ovnkube-script-lib\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114335 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-run-systemd\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114372 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxs7s\" (UniqueName: \"kubernetes.io/projected/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-kube-api-access-pxs7s\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114408 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-env-overrides\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114455 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114492 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-etc-openvswitch\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114556 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-ovn-node-metrics-cert\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114650 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-cni-netd\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114679 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-run-ovn\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114723 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-run-netns\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114800 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114865 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-etc-openvswitch\") pod 
\"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114888 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-run-netns\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114231 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-systemd-units\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114928 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-run-ovn\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114932 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-host-cni-netd\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114957 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-run-systemd\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc 
kubenswrapper[4964]: I1004 02:50:06.114818 4964 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.114998 4964 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.115020 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqkx7\" (UniqueName: \"kubernetes.io/projected/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9-kube-api-access-mqkx7\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.115413 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-ovnkube-script-lib\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.115787 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-env-overrides\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.117350 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-ovn-node-metrics-cert\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 
02:50:06.133033 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxs7s\" (UniqueName: \"kubernetes.io/projected/61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0-kube-api-access-pxs7s\") pod \"ovnkube-node-l5jk4\" (UID: \"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.254109 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:06 crc kubenswrapper[4964]: W1004 02:50:06.284387 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61033e7d_4f5a_40c6_8ff5_3a2aba23f9e0.slice/crio-23e151bc99a448c8e914a6801e668c2fd36ea9aba2d7aee2fa7ea85db0482570 WatchSource:0}: Error finding container 23e151bc99a448c8e914a6801e668c2fd36ea9aba2d7aee2fa7ea85db0482570: Status 404 returned error can't find the container with id 23e151bc99a448c8e914a6801e668c2fd36ea9aba2d7aee2fa7ea85db0482570 Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.819887 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q6hm8_10ea848d-0322-476d-976d-4ae3ac39910b/kube-multus/2.log" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.826772 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovn-acl-logging/0.log" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.828010 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xrs78_74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/ovn-controller/0.log" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.828854 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" 
event={"ID":"74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9","Type":"ContainerDied","Data":"9a613c2410497bf33723cd282e13fa4d849033737eca1cf90b1df50904a015b5"} Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.828938 4964 scope.go:117] "RemoveContainer" containerID="34418f62ccb97659c09eb2365f742b6446c010e8e64820cc506fd20baa60f0e0" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.828943 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xrs78" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.831273 4964 generic.go:334] "Generic (PLEG): container finished" podID="61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0" containerID="67092e4e8369d3c329625fd95e13271ac3906fe32d209e3859e22a8ae6a51516" exitCode=0 Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.831328 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" event={"ID":"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0","Type":"ContainerDied","Data":"67092e4e8369d3c329625fd95e13271ac3906fe32d209e3859e22a8ae6a51516"} Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.831370 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" event={"ID":"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0","Type":"ContainerStarted","Data":"23e151bc99a448c8e914a6801e668c2fd36ea9aba2d7aee2fa7ea85db0482570"} Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.873667 4964 scope.go:117] "RemoveContainer" containerID="fc2b79d4e123dd8c69baf3d5702e877807ca4ca5b38b3bc5411aea0381a4397f" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.897803 4964 scope.go:117] "RemoveContainer" containerID="45bfdb8e36808d0b4ea3e123f6fe142cc442b594b5f692c898b9f28264fa0032" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.918251 4964 scope.go:117] "RemoveContainer" containerID="d76b54590071ac879867d974961956227c3f66695093bff6cff918d02a28db16" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 
02:50:06.944726 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xrs78"] Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.948838 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xrs78"] Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.949444 4964 scope.go:117] "RemoveContainer" containerID="00c8188843cf05c149d5b2cd58412925fc4f7df02c2cbce30c061574e5974a54" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.967252 4964 scope.go:117] "RemoveContainer" containerID="4f0589aa1de2f235659cc308ff95c2f739b19fc8216b6d17d523fa5f84950a91" Oct 04 02:50:06 crc kubenswrapper[4964]: I1004 02:50:06.987042 4964 scope.go:117] "RemoveContainer" containerID="16fd31fb5b7088db7faa3c0a97b05f15ec29805f5264f2693bcd9db0ac0aeede" Oct 04 02:50:07 crc kubenswrapper[4964]: I1004 02:50:07.005570 4964 scope.go:117] "RemoveContainer" containerID="1bffc3fb8ef815122c43849dfbbb741e50d51898da4c28d82814fe42c7f7802f" Oct 04 02:50:07 crc kubenswrapper[4964]: I1004 02:50:07.024213 4964 scope.go:117] "RemoveContainer" containerID="ed8106c102f10fe051342ca568e4474e2aa23f4bed19d8aca0719f76da16304d" Oct 04 02:50:07 crc kubenswrapper[4964]: I1004 02:50:07.845935 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" event={"ID":"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0","Type":"ContainerStarted","Data":"bd017a415185b82b8e33d0ccdc7b2cbb93438ac901803cb16ef021f9f4d4aa57"} Oct 04 02:50:07 crc kubenswrapper[4964]: I1004 02:50:07.846390 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" event={"ID":"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0","Type":"ContainerStarted","Data":"a5e40d3c894e9a5133cfec19654d39e7f78fa19e3f3d91f456532500f63333e2"} Oct 04 02:50:07 crc kubenswrapper[4964]: I1004 02:50:07.846415 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" 
event={"ID":"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0","Type":"ContainerStarted","Data":"4cd907035d2744cbf8ff988a3f5f5a9a808541f38f275cbc8ec84ebf2dd63dca"} Oct 04 02:50:07 crc kubenswrapper[4964]: I1004 02:50:07.846433 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" event={"ID":"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0","Type":"ContainerStarted","Data":"1972459d9929407d4dab85735680a1ce1c2178e34118359f3d29fc521e3a58f3"} Oct 04 02:50:07 crc kubenswrapper[4964]: I1004 02:50:07.846450 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" event={"ID":"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0","Type":"ContainerStarted","Data":"ffde62058c1503035d5df24677599cd9281509f2710b9ecbc7f8a063d70776fe"} Oct 04 02:50:07 crc kubenswrapper[4964]: I1004 02:50:07.846466 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" event={"ID":"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0","Type":"ContainerStarted","Data":"39160e9f7c57ee4816382f9a7bc3b3613b68ebc2f5bae1fda81a2706d3e4b2e8"} Oct 04 02:50:08 crc kubenswrapper[4964]: I1004 02:50:08.855070 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9" path="/var/lib/kubelet/pods/74942bdc-b3cd-4b92-8b6e-0daf7c89e4e9/volumes" Oct 04 02:50:09 crc kubenswrapper[4964]: I1004 02:50:09.869006 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" event={"ID":"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0","Type":"ContainerStarted","Data":"171073ae7761fff2a2e80cdf0a2c85fb1331cc8c90f28f7cbfb8fb65caaf9479"} Oct 04 02:50:12 crc kubenswrapper[4964]: I1004 02:50:12.890201 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" 
event={"ID":"61033e7d-4f5a-40c6-8ff5-3a2aba23f9e0","Type":"ContainerStarted","Data":"b0fa041c5515d6124b2012f9c1513a90e898356e5477f015fd0b22884f55a8e2"} Oct 04 02:50:12 crc kubenswrapper[4964]: I1004 02:50:12.890718 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:12 crc kubenswrapper[4964]: I1004 02:50:12.890734 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:12 crc kubenswrapper[4964]: I1004 02:50:12.926556 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:12 crc kubenswrapper[4964]: I1004 02:50:12.932502 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" podStartSLOduration=7.932481871 podStartE2EDuration="7.932481871s" podCreationTimestamp="2025-10-04 02:50:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:50:12.926215566 +0000 UTC m=+592.823174264" watchObservedRunningTime="2025-10-04 02:50:12.932481871 +0000 UTC m=+592.829440519" Oct 04 02:50:13 crc kubenswrapper[4964]: I1004 02:50:13.897133 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:13 crc kubenswrapper[4964]: I1004 02:50:13.938033 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:19 crc kubenswrapper[4964]: I1004 02:50:19.845488 4964 scope.go:117] "RemoveContainer" containerID="b667cd3fd52cb198a428e96a085b12b34b610116ecc8aab4a77964917a4d4c6c" Oct 04 02:50:19 crc kubenswrapper[4964]: E1004 02:50:19.847674 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" 
with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-q6hm8_openshift-multus(10ea848d-0322-476d-976d-4ae3ac39910b)\"" pod="openshift-multus/multus-q6hm8" podUID="10ea848d-0322-476d-976d-4ae3ac39910b" Oct 04 02:50:32 crc kubenswrapper[4964]: I1004 02:50:32.845525 4964 scope.go:117] "RemoveContainer" containerID="b667cd3fd52cb198a428e96a085b12b34b610116ecc8aab4a77964917a4d4c6c" Oct 04 02:50:33 crc kubenswrapper[4964]: I1004 02:50:33.025141 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q6hm8_10ea848d-0322-476d-976d-4ae3ac39910b/kube-multus/2.log" Oct 04 02:50:33 crc kubenswrapper[4964]: I1004 02:50:33.025483 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q6hm8" event={"ID":"10ea848d-0322-476d-976d-4ae3ac39910b","Type":"ContainerStarted","Data":"7084d9a41083019f4fb5a6dd89304776565e1d91d66b486ce8dc1362335c9848"} Oct 04 02:50:34 crc kubenswrapper[4964]: I1004 02:50:34.449389 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:50:34 crc kubenswrapper[4964]: I1004 02:50:34.449685 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:50:34 crc kubenswrapper[4964]: I1004 02:50:34.449733 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:50:34 crc kubenswrapper[4964]: I1004 02:50:34.450325 4964 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1137f47c0aa6462aae7846003655bf95319654803dc6e0eaf3afe56be26f6389"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 02:50:34 crc kubenswrapper[4964]: I1004 02:50:34.450389 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://1137f47c0aa6462aae7846003655bf95319654803dc6e0eaf3afe56be26f6389" gracePeriod=600 Oct 04 02:50:35 crc kubenswrapper[4964]: I1004 02:50:35.038110 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="1137f47c0aa6462aae7846003655bf95319654803dc6e0eaf3afe56be26f6389" exitCode=0 Oct 04 02:50:35 crc kubenswrapper[4964]: I1004 02:50:35.038143 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"1137f47c0aa6462aae7846003655bf95319654803dc6e0eaf3afe56be26f6389"} Oct 04 02:50:35 crc kubenswrapper[4964]: I1004 02:50:35.038464 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"6efe9a8f74bbf3944c47eb916499cc67675937487bfe4fb926abac0853174b18"} Oct 04 02:50:35 crc kubenswrapper[4964]: I1004 02:50:35.038482 4964 scope.go:117] "RemoveContainer" containerID="ca513d76044d84d58871dc80cf9d1dc2e3bdff83478b7916d3faa2f268b14909" Oct 04 02:50:36 crc kubenswrapper[4964]: I1004 02:50:36.284715 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-l5jk4" Oct 04 02:50:44 crc kubenswrapper[4964]: I1004 02:50:44.492579 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6"] Oct 04 02:50:44 crc kubenswrapper[4964]: I1004 02:50:44.493894 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" Oct 04 02:50:44 crc kubenswrapper[4964]: I1004 02:50:44.495862 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 04 02:50:44 crc kubenswrapper[4964]: I1004 02:50:44.516175 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6"] Oct 04 02:50:44 crc kubenswrapper[4964]: I1004 02:50:44.612548 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8qm6\" (UniqueName: \"kubernetes.io/projected/7dcb37ec-6e78-4a26-897c-96c10ee42aba-kube-api-access-f8qm6\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6\" (UID: \"7dcb37ec-6e78-4a26-897c-96c10ee42aba\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" Oct 04 02:50:44 crc kubenswrapper[4964]: I1004 02:50:44.612670 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dcb37ec-6e78-4a26-897c-96c10ee42aba-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6\" (UID: \"7dcb37ec-6e78-4a26-897c-96c10ee42aba\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" Oct 04 02:50:44 crc kubenswrapper[4964]: I1004 02:50:44.612936 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dcb37ec-6e78-4a26-897c-96c10ee42aba-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6\" (UID: \"7dcb37ec-6e78-4a26-897c-96c10ee42aba\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" Oct 04 02:50:44 crc kubenswrapper[4964]: I1004 02:50:44.714589 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8qm6\" (UniqueName: \"kubernetes.io/projected/7dcb37ec-6e78-4a26-897c-96c10ee42aba-kube-api-access-f8qm6\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6\" (UID: \"7dcb37ec-6e78-4a26-897c-96c10ee42aba\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" Oct 04 02:50:44 crc kubenswrapper[4964]: I1004 02:50:44.714737 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dcb37ec-6e78-4a26-897c-96c10ee42aba-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6\" (UID: \"7dcb37ec-6e78-4a26-897c-96c10ee42aba\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" Oct 04 02:50:44 crc kubenswrapper[4964]: I1004 02:50:44.714836 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dcb37ec-6e78-4a26-897c-96c10ee42aba-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6\" (UID: \"7dcb37ec-6e78-4a26-897c-96c10ee42aba\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" Oct 04 02:50:44 crc kubenswrapper[4964]: I1004 02:50:44.716161 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dcb37ec-6e78-4a26-897c-96c10ee42aba-bundle\") pod 
\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6\" (UID: \"7dcb37ec-6e78-4a26-897c-96c10ee42aba\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" Oct 04 02:50:44 crc kubenswrapper[4964]: I1004 02:50:44.716424 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dcb37ec-6e78-4a26-897c-96c10ee42aba-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6\" (UID: \"7dcb37ec-6e78-4a26-897c-96c10ee42aba\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" Oct 04 02:50:44 crc kubenswrapper[4964]: I1004 02:50:44.749931 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8qm6\" (UniqueName: \"kubernetes.io/projected/7dcb37ec-6e78-4a26-897c-96c10ee42aba-kube-api-access-f8qm6\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6\" (UID: \"7dcb37ec-6e78-4a26-897c-96c10ee42aba\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" Oct 04 02:50:44 crc kubenswrapper[4964]: I1004 02:50:44.819462 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" Oct 04 02:50:45 crc kubenswrapper[4964]: I1004 02:50:45.297440 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6"] Oct 04 02:50:45 crc kubenswrapper[4964]: W1004 02:50:45.308468 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dcb37ec_6e78_4a26_897c_96c10ee42aba.slice/crio-a548f324d1c0b26b3b7404080c67104551a7a8a3b823a5627ab32733f1450868 WatchSource:0}: Error finding container a548f324d1c0b26b3b7404080c67104551a7a8a3b823a5627ab32733f1450868: Status 404 returned error can't find the container with id a548f324d1c0b26b3b7404080c67104551a7a8a3b823a5627ab32733f1450868 Oct 04 02:50:46 crc kubenswrapper[4964]: I1004 02:50:46.127363 4964 generic.go:334] "Generic (PLEG): container finished" podID="7dcb37ec-6e78-4a26-897c-96c10ee42aba" containerID="02ad9477dcf0a2f66736bb52ae6a986950a0544087fd6528062558fe0b1ae19b" exitCode=0 Oct 04 02:50:46 crc kubenswrapper[4964]: I1004 02:50:46.127581 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" event={"ID":"7dcb37ec-6e78-4a26-897c-96c10ee42aba","Type":"ContainerDied","Data":"02ad9477dcf0a2f66736bb52ae6a986950a0544087fd6528062558fe0b1ae19b"} Oct 04 02:50:46 crc kubenswrapper[4964]: I1004 02:50:46.127981 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" event={"ID":"7dcb37ec-6e78-4a26-897c-96c10ee42aba","Type":"ContainerStarted","Data":"a548f324d1c0b26b3b7404080c67104551a7a8a3b823a5627ab32733f1450868"} Oct 04 02:50:49 crc kubenswrapper[4964]: I1004 02:50:49.152898 4964 generic.go:334] "Generic (PLEG): container finished" 
podID="7dcb37ec-6e78-4a26-897c-96c10ee42aba" containerID="4df0a260ff23aae36b704bff7e1dc01b29d47b74661e1d4be55b800744f203cd" exitCode=0 Oct 04 02:50:49 crc kubenswrapper[4964]: I1004 02:50:49.153020 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" event={"ID":"7dcb37ec-6e78-4a26-897c-96c10ee42aba","Type":"ContainerDied","Data":"4df0a260ff23aae36b704bff7e1dc01b29d47b74661e1d4be55b800744f203cd"} Oct 04 02:50:50 crc kubenswrapper[4964]: I1004 02:50:50.166745 4964 generic.go:334] "Generic (PLEG): container finished" podID="7dcb37ec-6e78-4a26-897c-96c10ee42aba" containerID="2d98ce1d44444f3c567f879ecc551677c91e44000623f360ae6b0b5bf40efece" exitCode=0 Oct 04 02:50:50 crc kubenswrapper[4964]: I1004 02:50:50.166997 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" event={"ID":"7dcb37ec-6e78-4a26-897c-96c10ee42aba","Type":"ContainerDied","Data":"2d98ce1d44444f3c567f879ecc551677c91e44000623f360ae6b0b5bf40efece"} Oct 04 02:50:51 crc kubenswrapper[4964]: I1004 02:50:51.421515 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" Oct 04 02:50:51 crc kubenswrapper[4964]: I1004 02:50:51.526454 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dcb37ec-6e78-4a26-897c-96c10ee42aba-bundle\") pod \"7dcb37ec-6e78-4a26-897c-96c10ee42aba\" (UID: \"7dcb37ec-6e78-4a26-897c-96c10ee42aba\") " Oct 04 02:50:51 crc kubenswrapper[4964]: I1004 02:50:51.526592 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8qm6\" (UniqueName: \"kubernetes.io/projected/7dcb37ec-6e78-4a26-897c-96c10ee42aba-kube-api-access-f8qm6\") pod \"7dcb37ec-6e78-4a26-897c-96c10ee42aba\" (UID: \"7dcb37ec-6e78-4a26-897c-96c10ee42aba\") " Oct 04 02:50:51 crc kubenswrapper[4964]: I1004 02:50:51.526714 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dcb37ec-6e78-4a26-897c-96c10ee42aba-util\") pod \"7dcb37ec-6e78-4a26-897c-96c10ee42aba\" (UID: \"7dcb37ec-6e78-4a26-897c-96c10ee42aba\") " Oct 04 02:50:51 crc kubenswrapper[4964]: I1004 02:50:51.527454 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dcb37ec-6e78-4a26-897c-96c10ee42aba-bundle" (OuterVolumeSpecName: "bundle") pod "7dcb37ec-6e78-4a26-897c-96c10ee42aba" (UID: "7dcb37ec-6e78-4a26-897c-96c10ee42aba"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:50:51 crc kubenswrapper[4964]: I1004 02:50:51.537333 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dcb37ec-6e78-4a26-897c-96c10ee42aba-util" (OuterVolumeSpecName: "util") pod "7dcb37ec-6e78-4a26-897c-96c10ee42aba" (UID: "7dcb37ec-6e78-4a26-897c-96c10ee42aba"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:50:51 crc kubenswrapper[4964]: I1004 02:50:51.537871 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dcb37ec-6e78-4a26-897c-96c10ee42aba-kube-api-access-f8qm6" (OuterVolumeSpecName: "kube-api-access-f8qm6") pod "7dcb37ec-6e78-4a26-897c-96c10ee42aba" (UID: "7dcb37ec-6e78-4a26-897c-96c10ee42aba"). InnerVolumeSpecName "kube-api-access-f8qm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:50:51 crc kubenswrapper[4964]: I1004 02:50:51.628692 4964 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dcb37ec-6e78-4a26-897c-96c10ee42aba-util\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:51 crc kubenswrapper[4964]: I1004 02:50:51.628745 4964 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dcb37ec-6e78-4a26-897c-96c10ee42aba-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:51 crc kubenswrapper[4964]: I1004 02:50:51.628767 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8qm6\" (UniqueName: \"kubernetes.io/projected/7dcb37ec-6e78-4a26-897c-96c10ee42aba-kube-api-access-f8qm6\") on node \"crc\" DevicePath \"\"" Oct 04 02:50:52 crc kubenswrapper[4964]: I1004 02:50:52.183259 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" event={"ID":"7dcb37ec-6e78-4a26-897c-96c10ee42aba","Type":"ContainerDied","Data":"a548f324d1c0b26b3b7404080c67104551a7a8a3b823a5627ab32733f1450868"} Oct 04 02:50:52 crc kubenswrapper[4964]: I1004 02:50:52.183317 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a548f324d1c0b26b3b7404080c67104551a7a8a3b823a5627ab32733f1450868" Oct 04 02:50:52 crc kubenswrapper[4964]: I1004 02:50:52.183355 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6" Oct 04 02:50:53 crc kubenswrapper[4964]: I1004 02:50:53.561043 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-c2nvh"] Oct 04 02:50:53 crc kubenswrapper[4964]: E1004 02:50:53.561512 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcb37ec-6e78-4a26-897c-96c10ee42aba" containerName="pull" Oct 04 02:50:53 crc kubenswrapper[4964]: I1004 02:50:53.561527 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcb37ec-6e78-4a26-897c-96c10ee42aba" containerName="pull" Oct 04 02:50:53 crc kubenswrapper[4964]: E1004 02:50:53.561585 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcb37ec-6e78-4a26-897c-96c10ee42aba" containerName="util" Oct 04 02:50:53 crc kubenswrapper[4964]: I1004 02:50:53.561595 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcb37ec-6e78-4a26-897c-96c10ee42aba" containerName="util" Oct 04 02:50:53 crc kubenswrapper[4964]: E1004 02:50:53.561606 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcb37ec-6e78-4a26-897c-96c10ee42aba" containerName="extract" Oct 04 02:50:53 crc kubenswrapper[4964]: I1004 02:50:53.561614 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcb37ec-6e78-4a26-897c-96c10ee42aba" containerName="extract" Oct 04 02:50:53 crc kubenswrapper[4964]: I1004 02:50:53.561781 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dcb37ec-6e78-4a26-897c-96c10ee42aba" containerName="extract" Oct 04 02:50:53 crc kubenswrapper[4964]: I1004 02:50:53.562218 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-c2nvh" Oct 04 02:50:53 crc kubenswrapper[4964]: I1004 02:50:53.564587 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 04 02:50:53 crc kubenswrapper[4964]: I1004 02:50:53.565164 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 04 02:50:53 crc kubenswrapper[4964]: I1004 02:50:53.566512 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-wgbkx" Oct 04 02:50:53 crc kubenswrapper[4964]: I1004 02:50:53.617784 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-c2nvh"] Oct 04 02:50:53 crc kubenswrapper[4964]: I1004 02:50:53.655113 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmzsz\" (UniqueName: \"kubernetes.io/projected/9bf5755a-6882-4f0e-9146-0d925ad5ccc5-kube-api-access-tmzsz\") pod \"nmstate-operator-858ddd8f98-c2nvh\" (UID: \"9bf5755a-6882-4f0e-9146-0d925ad5ccc5\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-c2nvh" Oct 04 02:50:53 crc kubenswrapper[4964]: I1004 02:50:53.756840 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmzsz\" (UniqueName: \"kubernetes.io/projected/9bf5755a-6882-4f0e-9146-0d925ad5ccc5-kube-api-access-tmzsz\") pod \"nmstate-operator-858ddd8f98-c2nvh\" (UID: \"9bf5755a-6882-4f0e-9146-0d925ad5ccc5\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-c2nvh" Oct 04 02:50:53 crc kubenswrapper[4964]: I1004 02:50:53.774332 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmzsz\" (UniqueName: \"kubernetes.io/projected/9bf5755a-6882-4f0e-9146-0d925ad5ccc5-kube-api-access-tmzsz\") pod \"nmstate-operator-858ddd8f98-c2nvh\" (UID: 
\"9bf5755a-6882-4f0e-9146-0d925ad5ccc5\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-c2nvh" Oct 04 02:50:53 crc kubenswrapper[4964]: I1004 02:50:53.881281 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-c2nvh" Oct 04 02:50:54 crc kubenswrapper[4964]: I1004 02:50:54.324253 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-c2nvh"] Oct 04 02:50:55 crc kubenswrapper[4964]: I1004 02:50:55.205816 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-c2nvh" event={"ID":"9bf5755a-6882-4f0e-9146-0d925ad5ccc5","Type":"ContainerStarted","Data":"1d7d1fd89db89bdb5db6aee331341ad08626d9fa30c988a3febeb2939ed84a98"} Oct 04 02:50:57 crc kubenswrapper[4964]: I1004 02:50:57.219488 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-c2nvh" event={"ID":"9bf5755a-6882-4f0e-9146-0d925ad5ccc5","Type":"ContainerStarted","Data":"2689bc3085ab6fa4a5c1a31b7077688d5f46c668579535ff333e8b6e3d1cfdb1"} Oct 04 02:50:57 crc kubenswrapper[4964]: I1004 02:50:57.245511 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-c2nvh" podStartSLOduration=2.171767088 podStartE2EDuration="4.245488307s" podCreationTimestamp="2025-10-04 02:50:53 +0000 UTC" firstStartedPulling="2025-10-04 02:50:54.334932719 +0000 UTC m=+634.231891357" lastFinishedPulling="2025-10-04 02:50:56.408653948 +0000 UTC m=+636.305612576" observedRunningTime="2025-10-04 02:50:57.245288472 +0000 UTC m=+637.142247150" watchObservedRunningTime="2025-10-04 02:50:57.245488307 +0000 UTC m=+637.142446985" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.223163 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-s5r9n"] Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 
02:50:58.224788 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-s5r9n" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.228421 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-s5r9n"] Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.232571 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-nw2lv" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.234033 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b"] Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.234647 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.235971 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.250168 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-9z4n9"] Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.251666 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.267503 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b"] Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.323342 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7rgx\" (UniqueName: \"kubernetes.io/projected/ffcb8f9b-4f68-48ed-a155-99a05b8f508b-kube-api-access-w7rgx\") pod \"nmstate-metrics-fdff9cb8d-s5r9n\" (UID: \"ffcb8f9b-4f68-48ed-a155-99a05b8f508b\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-s5r9n" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.359737 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl"] Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.360503 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.362761 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.362775 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-h8lxb" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.362838 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.395940 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl"] Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.424922 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/c17a3833-2ef7-4e9b-a1b0-b065ead5133f-nmstate-lock\") pod \"nmstate-handler-9z4n9\" (UID: \"c17a3833-2ef7-4e9b-a1b0-b065ead5133f\") " pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.424958 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c17a3833-2ef7-4e9b-a1b0-b065ead5133f-dbus-socket\") pod \"nmstate-handler-9z4n9\" (UID: \"c17a3833-2ef7-4e9b-a1b0-b065ead5133f\") " pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.424994 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcx86\" (UniqueName: \"kubernetes.io/projected/c17a3833-2ef7-4e9b-a1b0-b065ead5133f-kube-api-access-tcx86\") pod \"nmstate-handler-9z4n9\" (UID: \"c17a3833-2ef7-4e9b-a1b0-b065ead5133f\") " pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.425037 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7rgx\" (UniqueName: \"kubernetes.io/projected/ffcb8f9b-4f68-48ed-a155-99a05b8f508b-kube-api-access-w7rgx\") pod \"nmstate-metrics-fdff9cb8d-s5r9n\" (UID: \"ffcb8f9b-4f68-48ed-a155-99a05b8f508b\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-s5r9n" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.425055 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c17a3833-2ef7-4e9b-a1b0-b065ead5133f-ovs-socket\") pod \"nmstate-handler-9z4n9\" (UID: \"c17a3833-2ef7-4e9b-a1b0-b065ead5133f\") " pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.425173 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7grzz\" (UniqueName: \"kubernetes.io/projected/db3cd097-a1d9-40ac-af2c-e5d35c8fcd95-kube-api-access-7grzz\") pod \"nmstate-webhook-6cdbc54649-clc2b\" (UID: \"db3cd097-a1d9-40ac-af2c-e5d35c8fcd95\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.425363 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/db3cd097-a1d9-40ac-af2c-e5d35c8fcd95-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-clc2b\" (UID: \"db3cd097-a1d9-40ac-af2c-e5d35c8fcd95\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.444729 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7rgx\" (UniqueName: \"kubernetes.io/projected/ffcb8f9b-4f68-48ed-a155-99a05b8f508b-kube-api-access-w7rgx\") pod \"nmstate-metrics-fdff9cb8d-s5r9n\" (UID: \"ffcb8f9b-4f68-48ed-a155-99a05b8f508b\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-s5r9n" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.526467 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c17a3833-2ef7-4e9b-a1b0-b065ead5133f-nmstate-lock\") pod \"nmstate-handler-9z4n9\" (UID: \"c17a3833-2ef7-4e9b-a1b0-b065ead5133f\") " pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.526756 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c17a3833-2ef7-4e9b-a1b0-b065ead5133f-dbus-socket\") pod \"nmstate-handler-9z4n9\" (UID: \"c17a3833-2ef7-4e9b-a1b0-b065ead5133f\") " pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.526594 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c17a3833-2ef7-4e9b-a1b0-b065ead5133f-nmstate-lock\") pod \"nmstate-handler-9z4n9\" (UID: \"c17a3833-2ef7-4e9b-a1b0-b065ead5133f\") " pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.526788 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c056ed4-2fd6-42dd-8702-a84d27d26fd2-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-z47kl\" (UID: \"3c056ed4-2fd6-42dd-8702-a84d27d26fd2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.526947 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcx86\" (UniqueName: \"kubernetes.io/projected/c17a3833-2ef7-4e9b-a1b0-b065ead5133f-kube-api-access-tcx86\") pod \"nmstate-handler-9z4n9\" (UID: \"c17a3833-2ef7-4e9b-a1b0-b065ead5133f\") " pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.527001 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfcxg\" (UniqueName: \"kubernetes.io/projected/3c056ed4-2fd6-42dd-8702-a84d27d26fd2-kube-api-access-dfcxg\") pod \"nmstate-console-plugin-6b874cbd85-z47kl\" (UID: \"3c056ed4-2fd6-42dd-8702-a84d27d26fd2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.527082 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c17a3833-2ef7-4e9b-a1b0-b065ead5133f-dbus-socket\") pod \"nmstate-handler-9z4n9\" (UID: \"c17a3833-2ef7-4e9b-a1b0-b065ead5133f\") " pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.527120 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c17a3833-2ef7-4e9b-a1b0-b065ead5133f-ovs-socket\") pod \"nmstate-handler-9z4n9\" (UID: \"c17a3833-2ef7-4e9b-a1b0-b065ead5133f\") " pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.527214 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c17a3833-2ef7-4e9b-a1b0-b065ead5133f-ovs-socket\") pod \"nmstate-handler-9z4n9\" (UID: \"c17a3833-2ef7-4e9b-a1b0-b065ead5133f\") " pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.527229 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3c056ed4-2fd6-42dd-8702-a84d27d26fd2-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-z47kl\" (UID: \"3c056ed4-2fd6-42dd-8702-a84d27d26fd2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.527343 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7grzz\" (UniqueName: \"kubernetes.io/projected/db3cd097-a1d9-40ac-af2c-e5d35c8fcd95-kube-api-access-7grzz\") pod \"nmstate-webhook-6cdbc54649-clc2b\" (UID: \"db3cd097-a1d9-40ac-af2c-e5d35c8fcd95\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.527425 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/db3cd097-a1d9-40ac-af2c-e5d35c8fcd95-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-clc2b\" (UID: \"db3cd097-a1d9-40ac-af2c-e5d35c8fcd95\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.531728 4964 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/db3cd097-a1d9-40ac-af2c-e5d35c8fcd95-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-clc2b\" (UID: \"db3cd097-a1d9-40ac-af2c-e5d35c8fcd95\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.543661 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcx86\" (UniqueName: \"kubernetes.io/projected/c17a3833-2ef7-4e9b-a1b0-b065ead5133f-kube-api-access-tcx86\") pod \"nmstate-handler-9z4n9\" (UID: \"c17a3833-2ef7-4e9b-a1b0-b065ead5133f\") " pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.544482 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-s5r9n" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.545721 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-747995869-8hq7g"] Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.546719 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.566300 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7grzz\" (UniqueName: \"kubernetes.io/projected/db3cd097-a1d9-40ac-af2c-e5d35c8fcd95-kube-api-access-7grzz\") pod \"nmstate-webhook-6cdbc54649-clc2b\" (UID: \"db3cd097-a1d9-40ac-af2c-e5d35c8fcd95\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.579338 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-747995869-8hq7g"] Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.582896 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.629159 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3c056ed4-2fd6-42dd-8702-a84d27d26fd2-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-z47kl\" (UID: \"3c056ed4-2fd6-42dd-8702-a84d27d26fd2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.629233 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c056ed4-2fd6-42dd-8702-a84d27d26fd2-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-z47kl\" (UID: \"3c056ed4-2fd6-42dd-8702-a84d27d26fd2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.629263 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfcxg\" (UniqueName: \"kubernetes.io/projected/3c056ed4-2fd6-42dd-8702-a84d27d26fd2-kube-api-access-dfcxg\") pod \"nmstate-console-plugin-6b874cbd85-z47kl\" (UID: \"3c056ed4-2fd6-42dd-8702-a84d27d26fd2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.630322 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3c056ed4-2fd6-42dd-8702-a84d27d26fd2-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-z47kl\" (UID: \"3c056ed4-2fd6-42dd-8702-a84d27d26fd2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.647380 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c056ed4-2fd6-42dd-8702-a84d27d26fd2-plugin-serving-cert\") 
pod \"nmstate-console-plugin-6b874cbd85-z47kl\" (UID: \"3c056ed4-2fd6-42dd-8702-a84d27d26fd2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.662860 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfcxg\" (UniqueName: \"kubernetes.io/projected/3c056ed4-2fd6-42dd-8702-a84d27d26fd2-kube-api-access-dfcxg\") pod \"nmstate-console-plugin-6b874cbd85-z47kl\" (UID: \"3c056ed4-2fd6-42dd-8702-a84d27d26fd2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.677792 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.730343 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgsn7\" (UniqueName: \"kubernetes.io/projected/49830f61-9325-4e36-9e39-edb72ce5dce0-kube-api-access-vgsn7\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.730393 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49830f61-9325-4e36-9e39-edb72ce5dce0-service-ca\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.730418 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49830f61-9325-4e36-9e39-edb72ce5dce0-console-serving-cert\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " 
pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.730438 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49830f61-9325-4e36-9e39-edb72ce5dce0-oauth-serving-cert\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.730454 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49830f61-9325-4e36-9e39-edb72ce5dce0-console-oauth-config\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.730474 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49830f61-9325-4e36-9e39-edb72ce5dce0-trusted-ca-bundle\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.730517 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49830f61-9325-4e36-9e39-edb72ce5dce0-console-config\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.763006 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-s5r9n"] Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.831188 4964 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49830f61-9325-4e36-9e39-edb72ce5dce0-console-config\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.831229 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgsn7\" (UniqueName: \"kubernetes.io/projected/49830f61-9325-4e36-9e39-edb72ce5dce0-kube-api-access-vgsn7\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.831258 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49830f61-9325-4e36-9e39-edb72ce5dce0-service-ca\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.831279 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49830f61-9325-4e36-9e39-edb72ce5dce0-console-serving-cert\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.831299 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49830f61-9325-4e36-9e39-edb72ce5dce0-oauth-serving-cert\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.831315 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49830f61-9325-4e36-9e39-edb72ce5dce0-console-oauth-config\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.831336 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49830f61-9325-4e36-9e39-edb72ce5dce0-trusted-ca-bundle\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.832337 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49830f61-9325-4e36-9e39-edb72ce5dce0-trusted-ca-bundle\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.832357 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49830f61-9325-4e36-9e39-edb72ce5dce0-console-config\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.832513 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49830f61-9325-4e36-9e39-edb72ce5dce0-service-ca\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.832537 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/49830f61-9325-4e36-9e39-edb72ce5dce0-oauth-serving-cert\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.835828 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49830f61-9325-4e36-9e39-edb72ce5dce0-console-serving-cert\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.843554 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49830f61-9325-4e36-9e39-edb72ce5dce0-console-oauth-config\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.852746 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgsn7\" (UniqueName: \"kubernetes.io/projected/49830f61-9325-4e36-9e39-edb72ce5dce0-kube-api-access-vgsn7\") pod \"console-747995869-8hq7g\" (UID: \"49830f61-9325-4e36-9e39-edb72ce5dce0\") " pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.853945 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b" Oct 04 02:50:58 crc kubenswrapper[4964]: I1004 02:50:58.907095 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-747995869-8hq7g" Oct 04 02:50:59 crc kubenswrapper[4964]: I1004 02:50:59.011997 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b"] Oct 04 02:50:59 crc kubenswrapper[4964]: W1004 02:50:59.018933 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb3cd097_a1d9_40ac_af2c_e5d35c8fcd95.slice/crio-00b6057fa8241d674d8f5c23330e5a6b11fd9a155b62105cc4a25d11b56c82e1 WatchSource:0}: Error finding container 00b6057fa8241d674d8f5c23330e5a6b11fd9a155b62105cc4a25d11b56c82e1: Status 404 returned error can't find the container with id 00b6057fa8241d674d8f5c23330e5a6b11fd9a155b62105cc4a25d11b56c82e1 Oct 04 02:50:59 crc kubenswrapper[4964]: I1004 02:50:59.079650 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-747995869-8hq7g"] Oct 04 02:50:59 crc kubenswrapper[4964]: W1004 02:50:59.083397 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49830f61_9325_4e36_9e39_edb72ce5dce0.slice/crio-8e6f300428612c37f06a55ee3a17b6be26aaa5127cc7d82acf69970441d09e54 WatchSource:0}: Error finding container 8e6f300428612c37f06a55ee3a17b6be26aaa5127cc7d82acf69970441d09e54: Status 404 returned error can't find the container with id 8e6f300428612c37f06a55ee3a17b6be26aaa5127cc7d82acf69970441d09e54 Oct 04 02:50:59 crc kubenswrapper[4964]: I1004 02:50:59.099177 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl"] Oct 04 02:50:59 crc kubenswrapper[4964]: W1004 02:50:59.103514 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c056ed4_2fd6_42dd_8702_a84d27d26fd2.slice/crio-5557ff3257efe3be96eee5350630c05c63daa197be70a867bc597031ea345914 
WatchSource:0}: Error finding container 5557ff3257efe3be96eee5350630c05c63daa197be70a867bc597031ea345914: Status 404 returned error can't find the container with id 5557ff3257efe3be96eee5350630c05c63daa197be70a867bc597031ea345914 Oct 04 02:50:59 crc kubenswrapper[4964]: I1004 02:50:59.231945 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-s5r9n" event={"ID":"ffcb8f9b-4f68-48ed-a155-99a05b8f508b","Type":"ContainerStarted","Data":"e173961ce253de3482fb0e78ff77828dcf1af069bd15005c0623cafd49638da8"} Oct 04 02:50:59 crc kubenswrapper[4964]: I1004 02:50:59.234060 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b" event={"ID":"db3cd097-a1d9-40ac-af2c-e5d35c8fcd95","Type":"ContainerStarted","Data":"00b6057fa8241d674d8f5c23330e5a6b11fd9a155b62105cc4a25d11b56c82e1"} Oct 04 02:50:59 crc kubenswrapper[4964]: I1004 02:50:59.235047 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl" event={"ID":"3c056ed4-2fd6-42dd-8702-a84d27d26fd2","Type":"ContainerStarted","Data":"5557ff3257efe3be96eee5350630c05c63daa197be70a867bc597031ea345914"} Oct 04 02:50:59 crc kubenswrapper[4964]: I1004 02:50:59.235952 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-9z4n9" event={"ID":"c17a3833-2ef7-4e9b-a1b0-b065ead5133f","Type":"ContainerStarted","Data":"1d42b462807941393b60e14487923094ffab751a853320a009cab22fa6f539e8"} Oct 04 02:50:59 crc kubenswrapper[4964]: I1004 02:50:59.237284 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-747995869-8hq7g" event={"ID":"49830f61-9325-4e36-9e39-edb72ce5dce0","Type":"ContainerStarted","Data":"7844d1f0486e1f11f15a266b452d2e3198f0fcc36fee40a2ae1d946242206fa3"} Oct 04 02:50:59 crc kubenswrapper[4964]: I1004 02:50:59.237361 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-747995869-8hq7g" event={"ID":"49830f61-9325-4e36-9e39-edb72ce5dce0","Type":"ContainerStarted","Data":"8e6f300428612c37f06a55ee3a17b6be26aaa5127cc7d82acf69970441d09e54"} Oct 04 02:50:59 crc kubenswrapper[4964]: I1004 02:50:59.253787 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-747995869-8hq7g" podStartSLOduration=1.253768848 podStartE2EDuration="1.253768848s" podCreationTimestamp="2025-10-04 02:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:50:59.251042063 +0000 UTC m=+639.148000701" watchObservedRunningTime="2025-10-04 02:50:59.253768848 +0000 UTC m=+639.150727486" Oct 04 02:51:01 crc kubenswrapper[4964]: I1004 02:51:01.249694 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b" event={"ID":"db3cd097-a1d9-40ac-af2c-e5d35c8fcd95","Type":"ContainerStarted","Data":"a9ea0ba1f12a8aaa043657d858d7017e1d2af2027ec7804470f1081aa19242d4"} Oct 04 02:51:01 crc kubenswrapper[4964]: I1004 02:51:01.250207 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b" Oct 04 02:51:01 crc kubenswrapper[4964]: I1004 02:51:01.251419 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-s5r9n" event={"ID":"ffcb8f9b-4f68-48ed-a155-99a05b8f508b","Type":"ContainerStarted","Data":"2199def3585345a7d380acc50d42d4e45db5e7513a5c3619f8a4341195b42c47"} Oct 04 02:51:01 crc kubenswrapper[4964]: I1004 02:51:01.273236 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b" podStartSLOduration=1.299819961 podStartE2EDuration="3.273218161s" podCreationTimestamp="2025-10-04 02:50:58 +0000 UTC" firstStartedPulling="2025-10-04 02:50:59.020939102 +0000 UTC 
m=+638.917897730" lastFinishedPulling="2025-10-04 02:51:00.994337292 +0000 UTC m=+640.891295930" observedRunningTime="2025-10-04 02:51:01.270224409 +0000 UTC m=+641.167183047" watchObservedRunningTime="2025-10-04 02:51:01.273218161 +0000 UTC m=+641.170176799" Oct 04 02:51:02 crc kubenswrapper[4964]: I1004 02:51:02.260074 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl" event={"ID":"3c056ed4-2fd6-42dd-8702-a84d27d26fd2","Type":"ContainerStarted","Data":"897ea73dcf2778f48a3b0647d6d27ab9d9fd20bf468e7c9bd0ee26381efba197"} Oct 04 02:51:02 crc kubenswrapper[4964]: I1004 02:51:02.264193 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-9z4n9" event={"ID":"c17a3833-2ef7-4e9b-a1b0-b065ead5133f","Type":"ContainerStarted","Data":"6db699cc8518181668cb3dc78fa9afb04d13e2a867fa8c99d7caef4ccd98ce7a"} Oct 04 02:51:02 crc kubenswrapper[4964]: I1004 02:51:02.264901 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:51:02 crc kubenswrapper[4964]: I1004 02:51:02.287172 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z47kl" podStartSLOduration=1.396360653 podStartE2EDuration="4.287147699s" podCreationTimestamp="2025-10-04 02:50:58 +0000 UTC" firstStartedPulling="2025-10-04 02:50:59.105419383 +0000 UTC m=+639.002378021" lastFinishedPulling="2025-10-04 02:51:01.996206429 +0000 UTC m=+641.893165067" observedRunningTime="2025-10-04 02:51:02.279689414 +0000 UTC m=+642.176648112" watchObservedRunningTime="2025-10-04 02:51:02.287147699 +0000 UTC m=+642.184106357" Oct 04 02:51:02 crc kubenswrapper[4964]: I1004 02:51:02.304867 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-9z4n9" podStartSLOduration=1.997340839 podStartE2EDuration="4.304845485s" 
podCreationTimestamp="2025-10-04 02:50:58 +0000 UTC" firstStartedPulling="2025-10-04 02:50:58.663731992 +0000 UTC m=+638.560690630" lastFinishedPulling="2025-10-04 02:51:00.971236638 +0000 UTC m=+640.868195276" observedRunningTime="2025-10-04 02:51:02.303740054 +0000 UTC m=+642.200698722" watchObservedRunningTime="2025-10-04 02:51:02.304845485 +0000 UTC m=+642.201804143" Oct 04 02:51:04 crc kubenswrapper[4964]: I1004 02:51:04.277229 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-s5r9n" event={"ID":"ffcb8f9b-4f68-48ed-a155-99a05b8f508b","Type":"ContainerStarted","Data":"d67a124329306fffdff5d0e0afb7f9f8869563d52b8d10bc15e8b383c74855ef"} Oct 04 02:51:04 crc kubenswrapper[4964]: I1004 02:51:04.303760 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-s5r9n" podStartSLOduration=1.816213265 podStartE2EDuration="6.303742144s" podCreationTimestamp="2025-10-04 02:50:58 +0000 UTC" firstStartedPulling="2025-10-04 02:50:58.810533434 +0000 UTC m=+638.707492072" lastFinishedPulling="2025-10-04 02:51:03.298062303 +0000 UTC m=+643.195020951" observedRunningTime="2025-10-04 02:51:04.299635642 +0000 UTC m=+644.196594310" watchObservedRunningTime="2025-10-04 02:51:04.303742144 +0000 UTC m=+644.200700782" Oct 04 02:51:08 crc kubenswrapper[4964]: I1004 02:51:08.617235 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-9z4n9" Oct 04 02:51:08 crc kubenswrapper[4964]: I1004 02:51:08.907535 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-747995869-8hq7g" Oct 04 02:51:08 crc kubenswrapper[4964]: I1004 02:51:08.907637 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-747995869-8hq7g" Oct 04 02:51:08 crc kubenswrapper[4964]: I1004 02:51:08.915973 4964 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-console/console-747995869-8hq7g" Oct 04 02:51:09 crc kubenswrapper[4964]: I1004 02:51:09.322570 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-747995869-8hq7g" Oct 04 02:51:09 crc kubenswrapper[4964]: I1004 02:51:09.388666 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-cznct"] Oct 04 02:51:18 crc kubenswrapper[4964]: I1004 02:51:18.862064 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-clc2b" Oct 04 02:51:34 crc kubenswrapper[4964]: I1004 02:51:34.455217 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-cznct" podUID="d30aed64-b8f7-4028-8dfc-f3661ce1c459" containerName="console" containerID="cri-o://57bdf1591cedaa3cb662241ee331590b756a3757be10245cdb11915ea4ffc3f1" gracePeriod=15 Oct 04 02:51:34 crc kubenswrapper[4964]: I1004 02:51:34.515505 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr"] Oct 04 02:51:34 crc kubenswrapper[4964]: I1004 02:51:34.516533 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" Oct 04 02:51:34 crc kubenswrapper[4964]: I1004 02:51:34.519429 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 04 02:51:34 crc kubenswrapper[4964]: I1004 02:51:34.527891 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr"] Oct 04 02:51:34 crc kubenswrapper[4964]: I1004 02:51:34.660301 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcnhr\" (UniqueName: \"kubernetes.io/projected/eee5f115-fb35-42ad-b604-33c3fc0f4d35-kube-api-access-pcnhr\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr\" (UID: \"eee5f115-fb35-42ad-b604-33c3fc0f4d35\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" Oct 04 02:51:34 crc kubenswrapper[4964]: I1004 02:51:34.660480 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eee5f115-fb35-42ad-b604-33c3fc0f4d35-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr\" (UID: \"eee5f115-fb35-42ad-b604-33c3fc0f4d35\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" Oct 04 02:51:34 crc kubenswrapper[4964]: I1004 02:51:34.660584 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eee5f115-fb35-42ad-b604-33c3fc0f4d35-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr\" (UID: \"eee5f115-fb35-42ad-b604-33c3fc0f4d35\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" Oct 04 02:51:34 crc kubenswrapper[4964]: 
I1004 02:51:34.761722 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eee5f115-fb35-42ad-b604-33c3fc0f4d35-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr\" (UID: \"eee5f115-fb35-42ad-b604-33c3fc0f4d35\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" Oct 04 02:51:34 crc kubenswrapper[4964]: I1004 02:51:34.761814 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcnhr\" (UniqueName: \"kubernetes.io/projected/eee5f115-fb35-42ad-b604-33c3fc0f4d35-kube-api-access-pcnhr\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr\" (UID: \"eee5f115-fb35-42ad-b604-33c3fc0f4d35\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" Oct 04 02:51:34 crc kubenswrapper[4964]: I1004 02:51:34.761916 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eee5f115-fb35-42ad-b604-33c3fc0f4d35-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr\" (UID: \"eee5f115-fb35-42ad-b604-33c3fc0f4d35\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" Oct 04 02:51:34 crc kubenswrapper[4964]: I1004 02:51:34.762425 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eee5f115-fb35-42ad-b604-33c3fc0f4d35-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr\" (UID: \"eee5f115-fb35-42ad-b604-33c3fc0f4d35\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" Oct 04 02:51:34 crc kubenswrapper[4964]: I1004 02:51:34.762487 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/eee5f115-fb35-42ad-b604-33c3fc0f4d35-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr\" (UID: \"eee5f115-fb35-42ad-b604-33c3fc0f4d35\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" Oct 04 02:51:34 crc kubenswrapper[4964]: I1004 02:51:34.789227 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcnhr\" (UniqueName: \"kubernetes.io/projected/eee5f115-fb35-42ad-b604-33c3fc0f4d35-kube-api-access-pcnhr\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr\" (UID: \"eee5f115-fb35-42ad-b604-33c3fc0f4d35\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" Oct 04 02:51:34 crc kubenswrapper[4964]: I1004 02:51:34.863122 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.143857 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr"] Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.494106 4964 generic.go:334] "Generic (PLEG): container finished" podID="eee5f115-fb35-42ad-b604-33c3fc0f4d35" containerID="52b38e6b3c6974771872301ca48d97c993a1e5a7bd31b791445d15a2449c9f86" exitCode=0 Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.494645 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" event={"ID":"eee5f115-fb35-42ad-b604-33c3fc0f4d35","Type":"ContainerDied","Data":"52b38e6b3c6974771872301ca48d97c993a1e5a7bd31b791445d15a2449c9f86"} Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.494700 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" event={"ID":"eee5f115-fb35-42ad-b604-33c3fc0f4d35","Type":"ContainerStarted","Data":"ffa72ea94eee7f0306d075419d5114cb761ddaa7d9bd486679d7dc4307a2c7a7"} Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.500966 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-cznct_d30aed64-b8f7-4028-8dfc-f3661ce1c459/console/0.log" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.501051 4964 generic.go:334] "Generic (PLEG): container finished" podID="d30aed64-b8f7-4028-8dfc-f3661ce1c459" containerID="57bdf1591cedaa3cb662241ee331590b756a3757be10245cdb11915ea4ffc3f1" exitCode=2 Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.501095 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cznct" event={"ID":"d30aed64-b8f7-4028-8dfc-f3661ce1c459","Type":"ContainerDied","Data":"57bdf1591cedaa3cb662241ee331590b756a3757be10245cdb11915ea4ffc3f1"} Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.544532 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-cznct_d30aed64-b8f7-4028-8dfc-f3661ce1c459/console/0.log" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.544771 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.693098 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-oauth-serving-cert\") pod \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.693168 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8qxg\" (UniqueName: \"kubernetes.io/projected/d30aed64-b8f7-4028-8dfc-f3661ce1c459-kube-api-access-v8qxg\") pod \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.693306 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-config\") pod \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.693348 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-service-ca\") pod \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.693406 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-serving-cert\") pod \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.693471 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-trusted-ca-bundle\") pod \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.693514 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-oauth-config\") pod \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\" (UID: \"d30aed64-b8f7-4028-8dfc-f3661ce1c459\") " Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.694521 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-config" (OuterVolumeSpecName: "console-config") pod "d30aed64-b8f7-4028-8dfc-f3661ce1c459" (UID: "d30aed64-b8f7-4028-8dfc-f3661ce1c459"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.694587 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d30aed64-b8f7-4028-8dfc-f3661ce1c459" (UID: "d30aed64-b8f7-4028-8dfc-f3661ce1c459"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.694606 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-service-ca" (OuterVolumeSpecName: "service-ca") pod "d30aed64-b8f7-4028-8dfc-f3661ce1c459" (UID: "d30aed64-b8f7-4028-8dfc-f3661ce1c459"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.694679 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d30aed64-b8f7-4028-8dfc-f3661ce1c459" (UID: "d30aed64-b8f7-4028-8dfc-f3661ce1c459"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.699017 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d30aed64-b8f7-4028-8dfc-f3661ce1c459" (UID: "d30aed64-b8f7-4028-8dfc-f3661ce1c459"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.700661 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d30aed64-b8f7-4028-8dfc-f3661ce1c459" (UID: "d30aed64-b8f7-4028-8dfc-f3661ce1c459"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.700699 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30aed64-b8f7-4028-8dfc-f3661ce1c459-kube-api-access-v8qxg" (OuterVolumeSpecName: "kube-api-access-v8qxg") pod "d30aed64-b8f7-4028-8dfc-f3661ce1c459" (UID: "d30aed64-b8f7-4028-8dfc-f3661ce1c459"). InnerVolumeSpecName "kube-api-access-v8qxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.795607 4964 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.795675 4964 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-service-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.795686 4964 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.795696 4964 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.795703 4964 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d30aed64-b8f7-4028-8dfc-f3661ce1c459-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.795713 4964 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d30aed64-b8f7-4028-8dfc-f3661ce1c459-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:51:35 crc kubenswrapper[4964]: I1004 02:51:35.795721 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8qxg\" (UniqueName: \"kubernetes.io/projected/d30aed64-b8f7-4028-8dfc-f3661ce1c459-kube-api-access-v8qxg\") on node \"crc\" DevicePath \"\"" Oct 04 02:51:36 crc 
kubenswrapper[4964]: I1004 02:51:36.511754 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-cznct_d30aed64-b8f7-4028-8dfc-f3661ce1c459/console/0.log" Oct 04 02:51:36 crc kubenswrapper[4964]: I1004 02:51:36.512300 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cznct" event={"ID":"d30aed64-b8f7-4028-8dfc-f3661ce1c459","Type":"ContainerDied","Data":"7e118f204c2cbb7288943f024e0d8747d0887d595d7625c47a9b2e6a1c6bb755"} Oct 04 02:51:36 crc kubenswrapper[4964]: I1004 02:51:36.512381 4964 scope.go:117] "RemoveContainer" containerID="57bdf1591cedaa3cb662241ee331590b756a3757be10245cdb11915ea4ffc3f1" Oct 04 02:51:36 crc kubenswrapper[4964]: I1004 02:51:36.512381 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-cznct" Oct 04 02:51:36 crc kubenswrapper[4964]: I1004 02:51:36.557421 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-cznct"] Oct 04 02:51:36 crc kubenswrapper[4964]: I1004 02:51:36.563396 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-cznct"] Oct 04 02:51:36 crc kubenswrapper[4964]: I1004 02:51:36.857592 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d30aed64-b8f7-4028-8dfc-f3661ce1c459" path="/var/lib/kubelet/pods/d30aed64-b8f7-4028-8dfc-f3661ce1c459/volumes" Oct 04 02:51:37 crc kubenswrapper[4964]: I1004 02:51:37.525412 4964 generic.go:334] "Generic (PLEG): container finished" podID="eee5f115-fb35-42ad-b604-33c3fc0f4d35" containerID="2011170df158bce524c9552c697154357d286ab7bc79587a40612222ed2ac897" exitCode=0 Oct 04 02:51:37 crc kubenswrapper[4964]: I1004 02:51:37.525518 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" 
event={"ID":"eee5f115-fb35-42ad-b604-33c3fc0f4d35","Type":"ContainerDied","Data":"2011170df158bce524c9552c697154357d286ab7bc79587a40612222ed2ac897"} Oct 04 02:51:38 crc kubenswrapper[4964]: I1004 02:51:38.535457 4964 generic.go:334] "Generic (PLEG): container finished" podID="eee5f115-fb35-42ad-b604-33c3fc0f4d35" containerID="822942668faa742b18f0f6ab13d328b9d76efb0033a711b5efbcd5d4e02bcdeb" exitCode=0 Oct 04 02:51:38 crc kubenswrapper[4964]: I1004 02:51:38.535519 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" event={"ID":"eee5f115-fb35-42ad-b604-33c3fc0f4d35","Type":"ContainerDied","Data":"822942668faa742b18f0f6ab13d328b9d76efb0033a711b5efbcd5d4e02bcdeb"} Oct 04 02:51:39 crc kubenswrapper[4964]: I1004 02:51:39.854523 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" Oct 04 02:51:39 crc kubenswrapper[4964]: I1004 02:51:39.955406 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eee5f115-fb35-42ad-b604-33c3fc0f4d35-util\") pod \"eee5f115-fb35-42ad-b604-33c3fc0f4d35\" (UID: \"eee5f115-fb35-42ad-b604-33c3fc0f4d35\") " Oct 04 02:51:39 crc kubenswrapper[4964]: I1004 02:51:39.955473 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eee5f115-fb35-42ad-b604-33c3fc0f4d35-bundle\") pod \"eee5f115-fb35-42ad-b604-33c3fc0f4d35\" (UID: \"eee5f115-fb35-42ad-b604-33c3fc0f4d35\") " Oct 04 02:51:39 crc kubenswrapper[4964]: I1004 02:51:39.955546 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcnhr\" (UniqueName: \"kubernetes.io/projected/eee5f115-fb35-42ad-b604-33c3fc0f4d35-kube-api-access-pcnhr\") pod \"eee5f115-fb35-42ad-b604-33c3fc0f4d35\" (UID: 
\"eee5f115-fb35-42ad-b604-33c3fc0f4d35\") " Oct 04 02:51:39 crc kubenswrapper[4964]: I1004 02:51:39.957047 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee5f115-fb35-42ad-b604-33c3fc0f4d35-bundle" (OuterVolumeSpecName: "bundle") pod "eee5f115-fb35-42ad-b604-33c3fc0f4d35" (UID: "eee5f115-fb35-42ad-b604-33c3fc0f4d35"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:51:39 crc kubenswrapper[4964]: I1004 02:51:39.963906 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee5f115-fb35-42ad-b604-33c3fc0f4d35-kube-api-access-pcnhr" (OuterVolumeSpecName: "kube-api-access-pcnhr") pod "eee5f115-fb35-42ad-b604-33c3fc0f4d35" (UID: "eee5f115-fb35-42ad-b604-33c3fc0f4d35"). InnerVolumeSpecName "kube-api-access-pcnhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:51:39 crc kubenswrapper[4964]: I1004 02:51:39.978268 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee5f115-fb35-42ad-b604-33c3fc0f4d35-util" (OuterVolumeSpecName: "util") pod "eee5f115-fb35-42ad-b604-33c3fc0f4d35" (UID: "eee5f115-fb35-42ad-b604-33c3fc0f4d35"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:51:40 crc kubenswrapper[4964]: I1004 02:51:40.057905 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcnhr\" (UniqueName: \"kubernetes.io/projected/eee5f115-fb35-42ad-b604-33c3fc0f4d35-kube-api-access-pcnhr\") on node \"crc\" DevicePath \"\"" Oct 04 02:51:40 crc kubenswrapper[4964]: I1004 02:51:40.057971 4964 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eee5f115-fb35-42ad-b604-33c3fc0f4d35-util\") on node \"crc\" DevicePath \"\"" Oct 04 02:51:40 crc kubenswrapper[4964]: I1004 02:51:40.057995 4964 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eee5f115-fb35-42ad-b604-33c3fc0f4d35-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:51:40 crc kubenswrapper[4964]: I1004 02:51:40.550064 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" event={"ID":"eee5f115-fb35-42ad-b604-33c3fc0f4d35","Type":"ContainerDied","Data":"ffa72ea94eee7f0306d075419d5114cb761ddaa7d9bd486679d7dc4307a2c7a7"} Oct 04 02:51:40 crc kubenswrapper[4964]: I1004 02:51:40.550117 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffa72ea94eee7f0306d075419d5114cb761ddaa7d9bd486679d7dc4307a2c7a7" Oct 04 02:51:40 crc kubenswrapper[4964]: I1004 02:51:40.550116 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.106396 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m"] Oct 04 02:51:49 crc kubenswrapper[4964]: E1004 02:51:49.106976 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee5f115-fb35-42ad-b604-33c3fc0f4d35" containerName="util" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.106987 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee5f115-fb35-42ad-b604-33c3fc0f4d35" containerName="util" Oct 04 02:51:49 crc kubenswrapper[4964]: E1004 02:51:49.106999 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30aed64-b8f7-4028-8dfc-f3661ce1c459" containerName="console" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.107004 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30aed64-b8f7-4028-8dfc-f3661ce1c459" containerName="console" Oct 04 02:51:49 crc kubenswrapper[4964]: E1004 02:51:49.107017 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee5f115-fb35-42ad-b604-33c3fc0f4d35" containerName="extract" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.107022 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee5f115-fb35-42ad-b604-33c3fc0f4d35" containerName="extract" Oct 04 02:51:49 crc kubenswrapper[4964]: E1004 02:51:49.107030 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee5f115-fb35-42ad-b604-33c3fc0f4d35" containerName="pull" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.107035 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee5f115-fb35-42ad-b604-33c3fc0f4d35" containerName="pull" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.107116 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30aed64-b8f7-4028-8dfc-f3661ce1c459" 
containerName="console" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.107131 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee5f115-fb35-42ad-b604-33c3fc0f4d35" containerName="extract" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.107476 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.109454 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.109823 4964 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.109828 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.110600 4964 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.118396 4964 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-vtdwr" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.125949 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m"] Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.177847 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84f70c83-2c96-4e7c-99c0-4322d4b97f04-webhook-cert\") pod \"metallb-operator-controller-manager-85cc4db5cb-nh55m\" (UID: \"84f70c83-2c96-4e7c-99c0-4322d4b97f04\") " pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" Oct 04 
02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.177899 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84f70c83-2c96-4e7c-99c0-4322d4b97f04-apiservice-cert\") pod \"metallb-operator-controller-manager-85cc4db5cb-nh55m\" (UID: \"84f70c83-2c96-4e7c-99c0-4322d4b97f04\") " pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.177960 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvsnh\" (UniqueName: \"kubernetes.io/projected/84f70c83-2c96-4e7c-99c0-4322d4b97f04-kube-api-access-xvsnh\") pod \"metallb-operator-controller-manager-85cc4db5cb-nh55m\" (UID: \"84f70c83-2c96-4e7c-99c0-4322d4b97f04\") " pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.279397 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84f70c83-2c96-4e7c-99c0-4322d4b97f04-webhook-cert\") pod \"metallb-operator-controller-manager-85cc4db5cb-nh55m\" (UID: \"84f70c83-2c96-4e7c-99c0-4322d4b97f04\") " pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.279435 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84f70c83-2c96-4e7c-99c0-4322d4b97f04-apiservice-cert\") pod \"metallb-operator-controller-manager-85cc4db5cb-nh55m\" (UID: \"84f70c83-2c96-4e7c-99c0-4322d4b97f04\") " pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.279510 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvsnh\" (UniqueName: 
\"kubernetes.io/projected/84f70c83-2c96-4e7c-99c0-4322d4b97f04-kube-api-access-xvsnh\") pod \"metallb-operator-controller-manager-85cc4db5cb-nh55m\" (UID: \"84f70c83-2c96-4e7c-99c0-4322d4b97f04\") " pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.290903 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84f70c83-2c96-4e7c-99c0-4322d4b97f04-webhook-cert\") pod \"metallb-operator-controller-manager-85cc4db5cb-nh55m\" (UID: \"84f70c83-2c96-4e7c-99c0-4322d4b97f04\") " pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.293668 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84f70c83-2c96-4e7c-99c0-4322d4b97f04-apiservice-cert\") pod \"metallb-operator-controller-manager-85cc4db5cb-nh55m\" (UID: \"84f70c83-2c96-4e7c-99c0-4322d4b97f04\") " pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.302848 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvsnh\" (UniqueName: \"kubernetes.io/projected/84f70c83-2c96-4e7c-99c0-4322d4b97f04-kube-api-access-xvsnh\") pod \"metallb-operator-controller-manager-85cc4db5cb-nh55m\" (UID: \"84f70c83-2c96-4e7c-99c0-4322d4b97f04\") " pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.421684 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.426529 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm"] Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.427507 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.441842 4964 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.442537 4964 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-28zkx" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.442976 4964 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.450558 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm"] Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.481683 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvkdd\" (UniqueName: \"kubernetes.io/projected/f95fd202-cbdb-4a99-ac3e-0182838a3c96-kube-api-access-jvkdd\") pod \"metallb-operator-webhook-server-66d576bc7b-2j5gm\" (UID: \"f95fd202-cbdb-4a99-ac3e-0182838a3c96\") " pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.481768 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f95fd202-cbdb-4a99-ac3e-0182838a3c96-apiservice-cert\") pod 
\"metallb-operator-webhook-server-66d576bc7b-2j5gm\" (UID: \"f95fd202-cbdb-4a99-ac3e-0182838a3c96\") " pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.481805 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f95fd202-cbdb-4a99-ac3e-0182838a3c96-webhook-cert\") pod \"metallb-operator-webhook-server-66d576bc7b-2j5gm\" (UID: \"f95fd202-cbdb-4a99-ac3e-0182838a3c96\") " pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.583803 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvkdd\" (UniqueName: \"kubernetes.io/projected/f95fd202-cbdb-4a99-ac3e-0182838a3c96-kube-api-access-jvkdd\") pod \"metallb-operator-webhook-server-66d576bc7b-2j5gm\" (UID: \"f95fd202-cbdb-4a99-ac3e-0182838a3c96\") " pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.584230 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f95fd202-cbdb-4a99-ac3e-0182838a3c96-apiservice-cert\") pod \"metallb-operator-webhook-server-66d576bc7b-2j5gm\" (UID: \"f95fd202-cbdb-4a99-ac3e-0182838a3c96\") " pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.584270 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f95fd202-cbdb-4a99-ac3e-0182838a3c96-webhook-cert\") pod \"metallb-operator-webhook-server-66d576bc7b-2j5gm\" (UID: \"f95fd202-cbdb-4a99-ac3e-0182838a3c96\") " pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.596852 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f95fd202-cbdb-4a99-ac3e-0182838a3c96-apiservice-cert\") pod \"metallb-operator-webhook-server-66d576bc7b-2j5gm\" (UID: \"f95fd202-cbdb-4a99-ac3e-0182838a3c96\") " pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.598118 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f95fd202-cbdb-4a99-ac3e-0182838a3c96-webhook-cert\") pod \"metallb-operator-webhook-server-66d576bc7b-2j5gm\" (UID: \"f95fd202-cbdb-4a99-ac3e-0182838a3c96\") " pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.609067 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvkdd\" (UniqueName: \"kubernetes.io/projected/f95fd202-cbdb-4a99-ac3e-0182838a3c96-kube-api-access-jvkdd\") pod \"metallb-operator-webhook-server-66d576bc7b-2j5gm\" (UID: \"f95fd202-cbdb-4a99-ac3e-0182838a3c96\") " pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.687903 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m"] Oct 04 02:51:49 crc kubenswrapper[4964]: W1004 02:51:49.695285 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84f70c83_2c96_4e7c_99c0_4322d4b97f04.slice/crio-9d907ace83e7b805365885244592a8177ad832bab490c716996946fa25b80cb7 WatchSource:0}: Error finding container 9d907ace83e7b805365885244592a8177ad832bab490c716996946fa25b80cb7: Status 404 returned error can't find the container with id 9d907ace83e7b805365885244592a8177ad832bab490c716996946fa25b80cb7 Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 
02:51:49.773691 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" Oct 04 02:51:49 crc kubenswrapper[4964]: I1004 02:51:49.964022 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm"] Oct 04 02:51:49 crc kubenswrapper[4964]: W1004 02:51:49.970636 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf95fd202_cbdb_4a99_ac3e_0182838a3c96.slice/crio-36515f05665cf9cc76e80f3520627af2f48d4ee4272d95bac00316386db89ba8 WatchSource:0}: Error finding container 36515f05665cf9cc76e80f3520627af2f48d4ee4272d95bac00316386db89ba8: Status 404 returned error can't find the container with id 36515f05665cf9cc76e80f3520627af2f48d4ee4272d95bac00316386db89ba8 Oct 04 02:51:50 crc kubenswrapper[4964]: I1004 02:51:50.617123 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" event={"ID":"f95fd202-cbdb-4a99-ac3e-0182838a3c96","Type":"ContainerStarted","Data":"36515f05665cf9cc76e80f3520627af2f48d4ee4272d95bac00316386db89ba8"} Oct 04 02:51:50 crc kubenswrapper[4964]: I1004 02:51:50.618578 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" event={"ID":"84f70c83-2c96-4e7c-99c0-4322d4b97f04","Type":"ContainerStarted","Data":"9d907ace83e7b805365885244592a8177ad832bab490c716996946fa25b80cb7"} Oct 04 02:51:53 crc kubenswrapper[4964]: I1004 02:51:53.639488 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" event={"ID":"84f70c83-2c96-4e7c-99c0-4322d4b97f04","Type":"ContainerStarted","Data":"95887a022f3b80c25c7808b5e395b5fc2c1a1c0432e9420ced6960c3c86030a1"} Oct 04 02:51:53 crc kubenswrapper[4964]: I1004 02:51:53.639933 4964 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" Oct 04 02:51:53 crc kubenswrapper[4964]: I1004 02:51:53.663749 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" podStartSLOduration=1.726609833 podStartE2EDuration="4.663731385s" podCreationTimestamp="2025-10-04 02:51:49 +0000 UTC" firstStartedPulling="2025-10-04 02:51:49.697313079 +0000 UTC m=+689.594271717" lastFinishedPulling="2025-10-04 02:51:52.634434631 +0000 UTC m=+692.531393269" observedRunningTime="2025-10-04 02:51:53.661843944 +0000 UTC m=+693.558802582" watchObservedRunningTime="2025-10-04 02:51:53.663731385 +0000 UTC m=+693.560690013" Oct 04 02:51:54 crc kubenswrapper[4964]: I1004 02:51:54.652664 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" event={"ID":"f95fd202-cbdb-4a99-ac3e-0182838a3c96","Type":"ContainerStarted","Data":"eb05013f8c92bca678682facf9d9f2ab3631047423b484ce2ef57358d1df57d2"} Oct 04 02:51:54 crc kubenswrapper[4964]: I1004 02:51:54.712390 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" podStartSLOduration=1.5628247339999999 podStartE2EDuration="5.712369646s" podCreationTimestamp="2025-10-04 02:51:49 +0000 UTC" firstStartedPulling="2025-10-04 02:51:49.973257499 +0000 UTC m=+689.870216137" lastFinishedPulling="2025-10-04 02:51:54.122802401 +0000 UTC m=+694.019761049" observedRunningTime="2025-10-04 02:51:54.709160181 +0000 UTC m=+694.606118839" watchObservedRunningTime="2025-10-04 02:51:54.712369646 +0000 UTC m=+694.609328294" Oct 04 02:51:55 crc kubenswrapper[4964]: I1004 02:51:55.657680 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" Oct 04 
02:52:09 crc kubenswrapper[4964]: I1004 02:52:09.777512 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-66d576bc7b-2j5gm" Oct 04 02:52:29 crc kubenswrapper[4964]: I1004 02:52:29.424149 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-85cc4db5cb-nh55m" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.233253 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v"] Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.234074 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.234767 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-fs4w8"] Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.236866 4964 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.237213 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.240677 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.241037 4964 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.241238 4964 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-sxn5j" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.243452 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v"] Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.316271 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-s6v9p"] Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.317396 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-s6v9p" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.319663 4964 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.319885 4964 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-zqd2w" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.319919 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.320737 4964 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.344638 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-tvxzc"] Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.346159 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-tvxzc" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.359132 4964 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.374283 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-tvxzc"] Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.375008 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae35709d-6c76-49dd-a685-664e41a117ba-cert\") pod \"frr-k8s-webhook-server-64bf5d555-6xh6v\" (UID: \"ae35709d-6c76-49dd-a685-664e41a117ba\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.375034 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7e06f03f-a746-4f64-a49c-16bc836bc682-metrics\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.375051 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/352336ed-63d9-4f9e-86e0-e25db230594a-memberlist\") pod \"speaker-s6v9p\" (UID: \"352336ed-63d9-4f9e-86e0-e25db230594a\") " pod="metallb-system/speaker-s6v9p" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.375083 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7e06f03f-a746-4f64-a49c-16bc836bc682-frr-startup\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 
02:52:30.375099 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzjdf\" (UniqueName: \"kubernetes.io/projected/ae35709d-6c76-49dd-a685-664e41a117ba-kube-api-access-pzjdf\") pod \"frr-k8s-webhook-server-64bf5d555-6xh6v\" (UID: \"ae35709d-6c76-49dd-a685-664e41a117ba\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.375115 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7e06f03f-a746-4f64-a49c-16bc836bc682-frr-conf\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.375137 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/352336ed-63d9-4f9e-86e0-e25db230594a-metallb-excludel2\") pod \"speaker-s6v9p\" (UID: \"352336ed-63d9-4f9e-86e0-e25db230594a\") " pod="metallb-system/speaker-s6v9p" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.375151 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7e06f03f-a746-4f64-a49c-16bc836bc682-reloader\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.375171 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcvhb\" (UniqueName: \"kubernetes.io/projected/352336ed-63d9-4f9e-86e0-e25db230594a-kube-api-access-wcvhb\") pod \"speaker-s6v9p\" (UID: \"352336ed-63d9-4f9e-86e0-e25db230594a\") " pod="metallb-system/speaker-s6v9p" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.375188 4964 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7e06f03f-a746-4f64-a49c-16bc836bc682-frr-sockets\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.375202 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8w9m\" (UniqueName: \"kubernetes.io/projected/7e06f03f-a746-4f64-a49c-16bc836bc682-kube-api-access-s8w9m\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.375219 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/352336ed-63d9-4f9e-86e0-e25db230594a-metrics-certs\") pod \"speaker-s6v9p\" (UID: \"352336ed-63d9-4f9e-86e0-e25db230594a\") " pod="metallb-system/speaker-s6v9p" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.375231 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e06f03f-a746-4f64-a49c-16bc836bc682-metrics-certs\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.476799 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc-cert\") pod \"controller-68d546b9d8-tvxzc\" (UID: \"e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc\") " pod="metallb-system/controller-68d546b9d8-tvxzc" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.476984 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/ae35709d-6c76-49dd-a685-664e41a117ba-cert\") pod \"frr-k8s-webhook-server-64bf5d555-6xh6v\" (UID: \"ae35709d-6c76-49dd-a685-664e41a117ba\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.477051 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7e06f03f-a746-4f64-a49c-16bc836bc682-metrics\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.477098 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/352336ed-63d9-4f9e-86e0-e25db230594a-memberlist\") pod \"speaker-s6v9p\" (UID: \"352336ed-63d9-4f9e-86e0-e25db230594a\") " pod="metallb-system/speaker-s6v9p" Oct 04 02:52:30 crc kubenswrapper[4964]: E1004 02:52:30.477244 4964 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 04 02:52:30 crc kubenswrapper[4964]: E1004 02:52:30.477422 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/352336ed-63d9-4f9e-86e0-e25db230594a-memberlist podName:352336ed-63d9-4f9e-86e0-e25db230594a nodeName:}" failed. No retries permitted until 2025-10-04 02:52:30.977401307 +0000 UTC m=+730.874359965 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/352336ed-63d9-4f9e-86e0-e25db230594a-memberlist") pod "speaker-s6v9p" (UID: "352336ed-63d9-4f9e-86e0-e25db230594a") : secret "metallb-memberlist" not found Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.477768 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7e06f03f-a746-4f64-a49c-16bc836bc682-frr-startup\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.477803 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzjdf\" (UniqueName: \"kubernetes.io/projected/ae35709d-6c76-49dd-a685-664e41a117ba-kube-api-access-pzjdf\") pod \"frr-k8s-webhook-server-64bf5d555-6xh6v\" (UID: \"ae35709d-6c76-49dd-a685-664e41a117ba\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.477828 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7e06f03f-a746-4f64-a49c-16bc836bc682-frr-conf\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.477962 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7e06f03f-a746-4f64-a49c-16bc836bc682-metrics\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.478106 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/352336ed-63d9-4f9e-86e0-e25db230594a-metallb-excludel2\") pod 
\"speaker-s6v9p\" (UID: \"352336ed-63d9-4f9e-86e0-e25db230594a\") " pod="metallb-system/speaker-s6v9p" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.478129 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7e06f03f-a746-4f64-a49c-16bc836bc682-reloader\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.478169 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcvhb\" (UniqueName: \"kubernetes.io/projected/352336ed-63d9-4f9e-86e0-e25db230594a-kube-api-access-wcvhb\") pod \"speaker-s6v9p\" (UID: \"352336ed-63d9-4f9e-86e0-e25db230594a\") " pod="metallb-system/speaker-s6v9p" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.478201 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgbsv\" (UniqueName: \"kubernetes.io/projected/e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc-kube-api-access-xgbsv\") pod \"controller-68d546b9d8-tvxzc\" (UID: \"e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc\") " pod="metallb-system/controller-68d546b9d8-tvxzc" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.478610 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7e06f03f-a746-4f64-a49c-16bc836bc682-frr-conf\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.478891 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7e06f03f-a746-4f64-a49c-16bc836bc682-reloader\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 
02:52:30.479129 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7e06f03f-a746-4f64-a49c-16bc836bc682-frr-startup\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.479219 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/352336ed-63d9-4f9e-86e0-e25db230594a-metallb-excludel2\") pod \"speaker-s6v9p\" (UID: \"352336ed-63d9-4f9e-86e0-e25db230594a\") " pod="metallb-system/speaker-s6v9p" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.479926 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7e06f03f-a746-4f64-a49c-16bc836bc682-frr-sockets\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.479487 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7e06f03f-a746-4f64-a49c-16bc836bc682-frr-sockets\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.480169 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8w9m\" (UniqueName: \"kubernetes.io/projected/7e06f03f-a746-4f64-a49c-16bc836bc682-kube-api-access-s8w9m\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.480203 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/352336ed-63d9-4f9e-86e0-e25db230594a-metrics-certs\") pod \"speaker-s6v9p\" (UID: \"352336ed-63d9-4f9e-86e0-e25db230594a\") " pod="metallb-system/speaker-s6v9p" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.480222 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e06f03f-a746-4f64-a49c-16bc836bc682-metrics-certs\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.480269 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc-metrics-certs\") pod \"controller-68d546b9d8-tvxzc\" (UID: \"e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc\") " pod="metallb-system/controller-68d546b9d8-tvxzc" Oct 04 02:52:30 crc kubenswrapper[4964]: E1004 02:52:30.480694 4964 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 04 02:52:30 crc kubenswrapper[4964]: E1004 02:52:30.480741 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/352336ed-63d9-4f9e-86e0-e25db230594a-metrics-certs podName:352336ed-63d9-4f9e-86e0-e25db230594a nodeName:}" failed. No retries permitted until 2025-10-04 02:52:30.980723626 +0000 UTC m=+730.877682264 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/352336ed-63d9-4f9e-86e0-e25db230594a-metrics-certs") pod "speaker-s6v9p" (UID: "352336ed-63d9-4f9e-86e0-e25db230594a") : secret "speaker-certs-secret" not found Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.484686 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e06f03f-a746-4f64-a49c-16bc836bc682-metrics-certs\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.484795 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae35709d-6c76-49dd-a685-664e41a117ba-cert\") pod \"frr-k8s-webhook-server-64bf5d555-6xh6v\" (UID: \"ae35709d-6c76-49dd-a685-664e41a117ba\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.495854 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzjdf\" (UniqueName: \"kubernetes.io/projected/ae35709d-6c76-49dd-a685-664e41a117ba-kube-api-access-pzjdf\") pod \"frr-k8s-webhook-server-64bf5d555-6xh6v\" (UID: \"ae35709d-6c76-49dd-a685-664e41a117ba\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.501250 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8w9m\" (UniqueName: \"kubernetes.io/projected/7e06f03f-a746-4f64-a49c-16bc836bc682-kube-api-access-s8w9m\") pod \"frr-k8s-fs4w8\" (UID: \"7e06f03f-a746-4f64-a49c-16bc836bc682\") " pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.501767 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcvhb\" (UniqueName: 
\"kubernetes.io/projected/352336ed-63d9-4f9e-86e0-e25db230594a-kube-api-access-wcvhb\") pod \"speaker-s6v9p\" (UID: \"352336ed-63d9-4f9e-86e0-e25db230594a\") " pod="metallb-system/speaker-s6v9p" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.560494 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.572992 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.581283 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgbsv\" (UniqueName: \"kubernetes.io/projected/e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc-kube-api-access-xgbsv\") pod \"controller-68d546b9d8-tvxzc\" (UID: \"e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc\") " pod="metallb-system/controller-68d546b9d8-tvxzc" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.581361 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc-metrics-certs\") pod \"controller-68d546b9d8-tvxzc\" (UID: \"e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc\") " pod="metallb-system/controller-68d546b9d8-tvxzc" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.581409 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc-cert\") pod \"controller-68d546b9d8-tvxzc\" (UID: \"e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc\") " pod="metallb-system/controller-68d546b9d8-tvxzc" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.584927 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc-metrics-certs\") pod 
\"controller-68d546b9d8-tvxzc\" (UID: \"e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc\") " pod="metallb-system/controller-68d546b9d8-tvxzc" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.585156 4964 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.594603 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc-cert\") pod \"controller-68d546b9d8-tvxzc\" (UID: \"e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc\") " pod="metallb-system/controller-68d546b9d8-tvxzc" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.603989 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgbsv\" (UniqueName: \"kubernetes.io/projected/e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc-kube-api-access-xgbsv\") pod \"controller-68d546b9d8-tvxzc\" (UID: \"e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc\") " pod="metallb-system/controller-68d546b9d8-tvxzc" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.680762 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-tvxzc" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.852870 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-tvxzc"] Oct 04 02:52:30 crc kubenswrapper[4964]: W1004 02:52:30.860114 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4ac4d52_fb50_4d4a_9e5b_b8a0aed2c6dc.slice/crio-8054531aa16bdc2f6b42ad5ad2189879bf46ed4a3ca63cfc1a9a52438c4ec285 WatchSource:0}: Error finding container 8054531aa16bdc2f6b42ad5ad2189879bf46ed4a3ca63cfc1a9a52438c4ec285: Status 404 returned error can't find the container with id 8054531aa16bdc2f6b42ad5ad2189879bf46ed4a3ca63cfc1a9a52438c4ec285 Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.890314 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-tvxzc" event={"ID":"e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc","Type":"ContainerStarted","Data":"8054531aa16bdc2f6b42ad5ad2189879bf46ed4a3ca63cfc1a9a52438c4ec285"} Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.891584 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fs4w8" event={"ID":"7e06f03f-a746-4f64-a49c-16bc836bc682","Type":"ContainerStarted","Data":"68d960ee27d12ca8eae10a50c68cba8d70317f851ffa28e90868210d6ee52934"} Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.944663 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v"] Oct 04 02:52:30 crc kubenswrapper[4964]: W1004 02:52:30.949236 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae35709d_6c76_49dd_a685_664e41a117ba.slice/crio-bdd5e24cb80156efd4f49e89dff2ee65e87bed4d7f9defad5ddf1855d9377fe5 WatchSource:0}: Error finding container 
bdd5e24cb80156efd4f49e89dff2ee65e87bed4d7f9defad5ddf1855d9377fe5: Status 404 returned error can't find the container with id bdd5e24cb80156efd4f49e89dff2ee65e87bed4d7f9defad5ddf1855d9377fe5 Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.991410 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/352336ed-63d9-4f9e-86e0-e25db230594a-metrics-certs\") pod \"speaker-s6v9p\" (UID: \"352336ed-63d9-4f9e-86e0-e25db230594a\") " pod="metallb-system/speaker-s6v9p" Oct 04 02:52:30 crc kubenswrapper[4964]: I1004 02:52:30.991973 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/352336ed-63d9-4f9e-86e0-e25db230594a-memberlist\") pod \"speaker-s6v9p\" (UID: \"352336ed-63d9-4f9e-86e0-e25db230594a\") " pod="metallb-system/speaker-s6v9p" Oct 04 02:52:30 crc kubenswrapper[4964]: E1004 02:52:30.992128 4964 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 04 02:52:30 crc kubenswrapper[4964]: E1004 02:52:30.992173 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/352336ed-63d9-4f9e-86e0-e25db230594a-memberlist podName:352336ed-63d9-4f9e-86e0-e25db230594a nodeName:}" failed. No retries permitted until 2025-10-04 02:52:31.99215999 +0000 UTC m=+731.889118628 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/352336ed-63d9-4f9e-86e0-e25db230594a-memberlist") pod "speaker-s6v9p" (UID: "352336ed-63d9-4f9e-86e0-e25db230594a") : secret "metallb-memberlist" not found Oct 04 02:52:31 crc kubenswrapper[4964]: I1004 02:52:31.003873 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/352336ed-63d9-4f9e-86e0-e25db230594a-metrics-certs\") pod \"speaker-s6v9p\" (UID: \"352336ed-63d9-4f9e-86e0-e25db230594a\") " pod="metallb-system/speaker-s6v9p" Oct 04 02:52:31 crc kubenswrapper[4964]: I1004 02:52:31.900691 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-tvxzc" event={"ID":"e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc","Type":"ContainerStarted","Data":"c29cbeafcc95697ea2d1f5cc07126430357d93f68cbe523743d878ce77a4a09e"} Oct 04 02:52:31 crc kubenswrapper[4964]: I1004 02:52:31.900771 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-tvxzc" Oct 04 02:52:31 crc kubenswrapper[4964]: I1004 02:52:31.900794 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-tvxzc" event={"ID":"e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc","Type":"ContainerStarted","Data":"79fd0903fa81dfb9c5f5ada762c639a98aa5e7eeea9e0150ac60d4c3bbc962fe"} Oct 04 02:52:31 crc kubenswrapper[4964]: I1004 02:52:31.901894 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v" event={"ID":"ae35709d-6c76-49dd-a685-664e41a117ba","Type":"ContainerStarted","Data":"bdd5e24cb80156efd4f49e89dff2ee65e87bed4d7f9defad5ddf1855d9377fe5"} Oct 04 02:52:32 crc kubenswrapper[4964]: I1004 02:52:32.003295 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/352336ed-63d9-4f9e-86e0-e25db230594a-memberlist\") pod 
\"speaker-s6v9p\" (UID: \"352336ed-63d9-4f9e-86e0-e25db230594a\") " pod="metallb-system/speaker-s6v9p" Oct 04 02:52:32 crc kubenswrapper[4964]: I1004 02:52:32.009226 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/352336ed-63d9-4f9e-86e0-e25db230594a-memberlist\") pod \"speaker-s6v9p\" (UID: \"352336ed-63d9-4f9e-86e0-e25db230594a\") " pod="metallb-system/speaker-s6v9p" Oct 04 02:52:32 crc kubenswrapper[4964]: I1004 02:52:32.131849 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-s6v9p" Oct 04 02:52:32 crc kubenswrapper[4964]: I1004 02:52:32.914701 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-s6v9p" event={"ID":"352336ed-63d9-4f9e-86e0-e25db230594a","Type":"ContainerStarted","Data":"3df977f8c68ad4f69b4ebb81e9533d25a5622d10615190422740a80ccc2f5249"} Oct 04 02:52:32 crc kubenswrapper[4964]: I1004 02:52:32.914751 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-s6v9p" event={"ID":"352336ed-63d9-4f9e-86e0-e25db230594a","Type":"ContainerStarted","Data":"ecd567b2af1bbf2ce302c8fa35859f10073520a11e08695b42ac54357c81fc48"} Oct 04 02:52:32 crc kubenswrapper[4964]: I1004 02:52:32.914762 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-s6v9p" event={"ID":"352336ed-63d9-4f9e-86e0-e25db230594a","Type":"ContainerStarted","Data":"1f34cf5cdc22af010f4912dbaa9822451a8b96a7ef130986a4b84ebc4134318e"} Oct 04 02:52:32 crc kubenswrapper[4964]: I1004 02:52:32.915003 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-s6v9p" Oct 04 02:52:32 crc kubenswrapper[4964]: I1004 02:52:32.935369 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-tvxzc" podStartSLOduration=2.935351115 podStartE2EDuration="2.935351115s" podCreationTimestamp="2025-10-04 02:52:30 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:52:31.929192579 +0000 UTC m=+731.826151237" watchObservedRunningTime="2025-10-04 02:52:32.935351115 +0000 UTC m=+732.832309753" Oct 04 02:52:32 crc kubenswrapper[4964]: I1004 02:52:32.937434 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-s6v9p" podStartSLOduration=2.937427339 podStartE2EDuration="2.937427339s" podCreationTimestamp="2025-10-04 02:52:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:52:32.934099012 +0000 UTC m=+732.831057650" watchObservedRunningTime="2025-10-04 02:52:32.937427339 +0000 UTC m=+732.834385977" Oct 04 02:52:34 crc kubenswrapper[4964]: I1004 02:52:34.448872 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:52:34 crc kubenswrapper[4964]: I1004 02:52:34.449180 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:52:38 crc kubenswrapper[4964]: I1004 02:52:38.951395 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v" event={"ID":"ae35709d-6c76-49dd-a685-664e41a117ba","Type":"ContainerStarted","Data":"b3f5f9f0f73e69310c6f22d4482ae6a449cdba53efdefe90de4e1b4e190deab5"} Oct 04 02:52:38 crc kubenswrapper[4964]: I1004 02:52:38.952265 4964 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v" Oct 04 02:52:38 crc kubenswrapper[4964]: I1004 02:52:38.955676 4964 generic.go:334] "Generic (PLEG): container finished" podID="7e06f03f-a746-4f64-a49c-16bc836bc682" containerID="2f15ce9831a3f0826e25468ea8ddcfc0d7c542e3fce8579fef9bd3f5e18677f2" exitCode=0 Oct 04 02:52:38 crc kubenswrapper[4964]: I1004 02:52:38.955736 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fs4w8" event={"ID":"7e06f03f-a746-4f64-a49c-16bc836bc682","Type":"ContainerDied","Data":"2f15ce9831a3f0826e25468ea8ddcfc0d7c542e3fce8579fef9bd3f5e18677f2"} Oct 04 02:52:38 crc kubenswrapper[4964]: I1004 02:52:38.987918 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v" podStartSLOduration=1.628294392 podStartE2EDuration="8.987878547s" podCreationTimestamp="2025-10-04 02:52:30 +0000 UTC" firstStartedPulling="2025-10-04 02:52:30.952097101 +0000 UTC m=+730.849055739" lastFinishedPulling="2025-10-04 02:52:38.311681256 +0000 UTC m=+738.208639894" observedRunningTime="2025-10-04 02:52:38.97588763 +0000 UTC m=+738.872846328" watchObservedRunningTime="2025-10-04 02:52:38.987878547 +0000 UTC m=+738.884837235" Oct 04 02:52:39 crc kubenswrapper[4964]: I1004 02:52:39.973735 4964 generic.go:334] "Generic (PLEG): container finished" podID="7e06f03f-a746-4f64-a49c-16bc836bc682" containerID="163cd06c123a6ad57621ee55b76cff6bba4e4c8687a6b8b6d56fc155b95b6c8a" exitCode=0 Oct 04 02:52:39 crc kubenswrapper[4964]: I1004 02:52:39.973849 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fs4w8" event={"ID":"7e06f03f-a746-4f64-a49c-16bc836bc682","Type":"ContainerDied","Data":"163cd06c123a6ad57621ee55b76cff6bba4e4c8687a6b8b6d56fc155b95b6c8a"} Oct 04 02:52:40 crc kubenswrapper[4964]: I1004 02:52:40.686331 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/controller-68d546b9d8-tvxzc" Oct 04 02:52:40 crc kubenswrapper[4964]: I1004 02:52:40.983558 4964 generic.go:334] "Generic (PLEG): container finished" podID="7e06f03f-a746-4f64-a49c-16bc836bc682" containerID="1231f073b865c749a5ac41f59d852ed2fa730f8554a5fd6b23515561001bf0b2" exitCode=0 Oct 04 02:52:40 crc kubenswrapper[4964]: I1004 02:52:40.983597 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fs4w8" event={"ID":"7e06f03f-a746-4f64-a49c-16bc836bc682","Type":"ContainerDied","Data":"1231f073b865c749a5ac41f59d852ed2fa730f8554a5fd6b23515561001bf0b2"} Oct 04 02:52:41 crc kubenswrapper[4964]: I1004 02:52:41.994565 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fs4w8" event={"ID":"7e06f03f-a746-4f64-a49c-16bc836bc682","Type":"ContainerStarted","Data":"b2bcfe604da94579fd635c601d9bdb37b499fff725369e49ac1e01d6eee33e9d"} Oct 04 02:52:41 crc kubenswrapper[4964]: I1004 02:52:41.994884 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fs4w8" event={"ID":"7e06f03f-a746-4f64-a49c-16bc836bc682","Type":"ContainerStarted","Data":"7f4604077fc1afd9fd6c40d02c7b3aee7115bd7dff934e580c2ed20e80f9d32b"} Oct 04 02:52:41 crc kubenswrapper[4964]: I1004 02:52:41.994896 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fs4w8" event={"ID":"7e06f03f-a746-4f64-a49c-16bc836bc682","Type":"ContainerStarted","Data":"2f822f4b7d6d40a162b4277d37e393c45500d04efdb468a127a760856dfe2653"} Oct 04 02:52:41 crc kubenswrapper[4964]: I1004 02:52:41.994905 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fs4w8" event={"ID":"7e06f03f-a746-4f64-a49c-16bc836bc682","Type":"ContainerStarted","Data":"98238c0921f3b947cccb758f8f453de34d5cdd6d6b38737d33205bd276ce9478"} Oct 04 02:52:41 crc kubenswrapper[4964]: I1004 02:52:41.994912 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fs4w8" 
event={"ID":"7e06f03f-a746-4f64-a49c-16bc836bc682","Type":"ContainerStarted","Data":"9df3792679b6d23a1745bcb04ad3e59bdc8fd2fbe837bda98115730e0d47826d"} Oct 04 02:52:42 crc kubenswrapper[4964]: I1004 02:52:42.135789 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-s6v9p" Oct 04 02:52:43 crc kubenswrapper[4964]: I1004 02:52:43.009026 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fs4w8" event={"ID":"7e06f03f-a746-4f64-a49c-16bc836bc682","Type":"ContainerStarted","Data":"208dcaa22598f4a8ccb7b53a6bf4f93468521610d12feea545c84a2944cea6fb"} Oct 04 02:52:43 crc kubenswrapper[4964]: I1004 02:52:43.009553 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:45 crc kubenswrapper[4964]: I1004 02:52:45.273328 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-fs4w8" podStartSLOduration=7.679811808 podStartE2EDuration="15.273308648s" podCreationTimestamp="2025-10-04 02:52:30 +0000 UTC" firstStartedPulling="2025-10-04 02:52:30.742007805 +0000 UTC m=+730.638966463" lastFinishedPulling="2025-10-04 02:52:38.335504665 +0000 UTC m=+738.232463303" observedRunningTime="2025-10-04 02:52:43.058483199 +0000 UTC m=+742.955441887" watchObservedRunningTime="2025-10-04 02:52:45.273308648 +0000 UTC m=+745.170267276" Oct 04 02:52:45 crc kubenswrapper[4964]: I1004 02:52:45.276166 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mtvpm"] Oct 04 02:52:45 crc kubenswrapper[4964]: I1004 02:52:45.276833 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mtvpm" Oct 04 02:52:45 crc kubenswrapper[4964]: I1004 02:52:45.278458 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 04 02:52:45 crc kubenswrapper[4964]: I1004 02:52:45.278705 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-w425m" Oct 04 02:52:45 crc kubenswrapper[4964]: I1004 02:52:45.281452 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 04 02:52:45 crc kubenswrapper[4964]: I1004 02:52:45.293812 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mtvpm"] Oct 04 02:52:45 crc kubenswrapper[4964]: I1004 02:52:45.412666 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvl52\" (UniqueName: \"kubernetes.io/projected/2c77936f-8b10-43f3-af24-af8fa6c201e1-kube-api-access-gvl52\") pod \"openstack-operator-index-mtvpm\" (UID: \"2c77936f-8b10-43f3-af24-af8fa6c201e1\") " pod="openstack-operators/openstack-operator-index-mtvpm" Oct 04 02:52:45 crc kubenswrapper[4964]: I1004 02:52:45.513736 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvl52\" (UniqueName: \"kubernetes.io/projected/2c77936f-8b10-43f3-af24-af8fa6c201e1-kube-api-access-gvl52\") pod \"openstack-operator-index-mtvpm\" (UID: \"2c77936f-8b10-43f3-af24-af8fa6c201e1\") " pod="openstack-operators/openstack-operator-index-mtvpm" Oct 04 02:52:45 crc kubenswrapper[4964]: I1004 02:52:45.533069 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvl52\" (UniqueName: \"kubernetes.io/projected/2c77936f-8b10-43f3-af24-af8fa6c201e1-kube-api-access-gvl52\") pod \"openstack-operator-index-mtvpm\" (UID: 
\"2c77936f-8b10-43f3-af24-af8fa6c201e1\") " pod="openstack-operators/openstack-operator-index-mtvpm" Oct 04 02:52:45 crc kubenswrapper[4964]: I1004 02:52:45.574138 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:45 crc kubenswrapper[4964]: I1004 02:52:45.596147 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mtvpm" Oct 04 02:52:45 crc kubenswrapper[4964]: I1004 02:52:45.645059 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:52:45 crc kubenswrapper[4964]: I1004 02:52:45.911079 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mtvpm"] Oct 04 02:52:46 crc kubenswrapper[4964]: I1004 02:52:46.028587 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mtvpm" event={"ID":"2c77936f-8b10-43f3-af24-af8fa6c201e1","Type":"ContainerStarted","Data":"4cf5371181087cf29fa00bbaa7c66d93def2c364ed39427ea202dac661c71b87"} Oct 04 02:52:47 crc kubenswrapper[4964]: I1004 02:52:47.486877 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nr92k"] Oct 04 02:52:47 crc kubenswrapper[4964]: I1004 02:52:47.487455 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" podUID="da04224f-997b-4890-b0c8-2bf983b1f21d" containerName="controller-manager" containerID="cri-o://dde9250f8905fa22eb0e6a9888a34b5013ae5c645a8507bfe8521fa0c4309bdc" gracePeriod=30 Oct 04 02:52:47 crc kubenswrapper[4964]: I1004 02:52:47.583833 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh"] Oct 04 02:52:47 crc kubenswrapper[4964]: I1004 02:52:47.584021 4964 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" podUID="c7b419e3-b339-4a82-8cb1-c14467712c1f" containerName="route-controller-manager" containerID="cri-o://78ec000dcd20eb17f9deb299cce5dbe68a7bbd40ef1808db1f07120a7481c3ab" gracePeriod=30 Oct 04 02:52:48 crc kubenswrapper[4964]: I1004 02:52:48.043413 4964 generic.go:334] "Generic (PLEG): container finished" podID="da04224f-997b-4890-b0c8-2bf983b1f21d" containerID="dde9250f8905fa22eb0e6a9888a34b5013ae5c645a8507bfe8521fa0c4309bdc" exitCode=0 Oct 04 02:52:48 crc kubenswrapper[4964]: I1004 02:52:48.043455 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" event={"ID":"da04224f-997b-4890-b0c8-2bf983b1f21d","Type":"ContainerDied","Data":"dde9250f8905fa22eb0e6a9888a34b5013ae5c645a8507bfe8521fa0c4309bdc"} Oct 04 02:52:48 crc kubenswrapper[4964]: I1004 02:52:48.649531 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mtvpm"] Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.051056 4964 generic.go:334] "Generic (PLEG): container finished" podID="c7b419e3-b339-4a82-8cb1-c14467712c1f" containerID="78ec000dcd20eb17f9deb299cce5dbe68a7bbd40ef1808db1f07120a7481c3ab" exitCode=0 Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.051097 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" event={"ID":"c7b419e3-b339-4a82-8cb1-c14467712c1f","Type":"ContainerDied","Data":"78ec000dcd20eb17f9deb299cce5dbe68a7bbd40ef1808db1f07120a7481c3ab"} Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.267008 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-l59hf"] Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.268214 4964 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-index-l59hf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.277719 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-l59hf"] Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.370075 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtvrr\" (UniqueName: \"kubernetes.io/projected/a72faf85-343c-4d46-8773-97a366ed031a-kube-api-access-qtvrr\") pod \"openstack-operator-index-l59hf\" (UID: \"a72faf85-343c-4d46-8773-97a366ed031a\") " pod="openstack-operators/openstack-operator-index-l59hf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.471462 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtvrr\" (UniqueName: \"kubernetes.io/projected/a72faf85-343c-4d46-8773-97a366ed031a-kube-api-access-qtvrr\") pod \"openstack-operator-index-l59hf\" (UID: \"a72faf85-343c-4d46-8773-97a366ed031a\") " pod="openstack-operators/openstack-operator-index-l59hf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.506026 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtvrr\" (UniqueName: \"kubernetes.io/projected/a72faf85-343c-4d46-8773-97a366ed031a-kube-api-access-qtvrr\") pod \"openstack-operator-index-l59hf\" (UID: \"a72faf85-343c-4d46-8773-97a366ed031a\") " pod="openstack-operators/openstack-operator-index-l59hf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.535510 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.544846 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.570281 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf"] Oct 04 02:52:49 crc kubenswrapper[4964]: E1004 02:52:49.570557 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b419e3-b339-4a82-8cb1-c14467712c1f" containerName="route-controller-manager" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.570573 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b419e3-b339-4a82-8cb1-c14467712c1f" containerName="route-controller-manager" Oct 04 02:52:49 crc kubenswrapper[4964]: E1004 02:52:49.570585 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da04224f-997b-4890-b0c8-2bf983b1f21d" containerName="controller-manager" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.570594 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="da04224f-997b-4890-b0c8-2bf983b1f21d" containerName="controller-manager" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.571279 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b419e3-b339-4a82-8cb1-c14467712c1f" containerName="route-controller-manager" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.571317 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="da04224f-997b-4890-b0c8-2bf983b1f21d" containerName="controller-manager" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.571820 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.600943 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-l59hf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.605862 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf"] Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.673030 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-proxy-ca-bundles\") pod \"da04224f-997b-4890-b0c8-2bf983b1f21d\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.673353 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7b419e3-b339-4a82-8cb1-c14467712c1f-client-ca\") pod \"c7b419e3-b339-4a82-8cb1-c14467712c1f\" (UID: \"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.673388 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-client-ca\") pod \"da04224f-997b-4890-b0c8-2bf983b1f21d\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.673426 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7b419e3-b339-4a82-8cb1-c14467712c1f-serving-cert\") pod \"c7b419e3-b339-4a82-8cb1-c14467712c1f\" (UID: \"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.673481 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7tc4\" (UniqueName: \"kubernetes.io/projected/c7b419e3-b339-4a82-8cb1-c14467712c1f-kube-api-access-j7tc4\") pod \"c7b419e3-b339-4a82-8cb1-c14467712c1f\" (UID: 
\"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.673513 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da04224f-997b-4890-b0c8-2bf983b1f21d-serving-cert\") pod \"da04224f-997b-4890-b0c8-2bf983b1f21d\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.673531 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7b419e3-b339-4a82-8cb1-c14467712c1f-config\") pod \"c7b419e3-b339-4a82-8cb1-c14467712c1f\" (UID: \"c7b419e3-b339-4a82-8cb1-c14467712c1f\") " Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.673555 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-config\") pod \"da04224f-997b-4890-b0c8-2bf983b1f21d\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.673575 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-448bq\" (UniqueName: \"kubernetes.io/projected/da04224f-997b-4890-b0c8-2bf983b1f21d-kube-api-access-448bq\") pod \"da04224f-997b-4890-b0c8-2bf983b1f21d\" (UID: \"da04224f-997b-4890-b0c8-2bf983b1f21d\") " Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.673725 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e2babd0-af3f-40e6-84ed-19e46619f5c9-client-ca\") pod \"route-controller-manager-8494ffc7-45sdf\" (UID: \"1e2babd0-af3f-40e6-84ed-19e46619f5c9\") " pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.673753 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e2babd0-af3f-40e6-84ed-19e46619f5c9-serving-cert\") pod \"route-controller-manager-8494ffc7-45sdf\" (UID: \"1e2babd0-af3f-40e6-84ed-19e46619f5c9\") " pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.673787 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c4h9\" (UniqueName: \"kubernetes.io/projected/1e2babd0-af3f-40e6-84ed-19e46619f5c9-kube-api-access-9c4h9\") pod \"route-controller-manager-8494ffc7-45sdf\" (UID: \"1e2babd0-af3f-40e6-84ed-19e46619f5c9\") " pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.673851 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e2babd0-af3f-40e6-84ed-19e46619f5c9-config\") pod \"route-controller-manager-8494ffc7-45sdf\" (UID: \"1e2babd0-af3f-40e6-84ed-19e46619f5c9\") " pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.674300 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-client-ca" (OuterVolumeSpecName: "client-ca") pod "da04224f-997b-4890-b0c8-2bf983b1f21d" (UID: "da04224f-997b-4890-b0c8-2bf983b1f21d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.674832 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7b419e3-b339-4a82-8cb1-c14467712c1f-client-ca" (OuterVolumeSpecName: "client-ca") pod "c7b419e3-b339-4a82-8cb1-c14467712c1f" (UID: "c7b419e3-b339-4a82-8cb1-c14467712c1f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.675019 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7b419e3-b339-4a82-8cb1-c14467712c1f-config" (OuterVolumeSpecName: "config") pod "c7b419e3-b339-4a82-8cb1-c14467712c1f" (UID: "c7b419e3-b339-4a82-8cb1-c14467712c1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.675135 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-config" (OuterVolumeSpecName: "config") pod "da04224f-997b-4890-b0c8-2bf983b1f21d" (UID: "da04224f-997b-4890-b0c8-2bf983b1f21d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.675346 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "da04224f-997b-4890-b0c8-2bf983b1f21d" (UID: "da04224f-997b-4890-b0c8-2bf983b1f21d"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.677938 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da04224f-997b-4890-b0c8-2bf983b1f21d-kube-api-access-448bq" (OuterVolumeSpecName: "kube-api-access-448bq") pod "da04224f-997b-4890-b0c8-2bf983b1f21d" (UID: "da04224f-997b-4890-b0c8-2bf983b1f21d"). InnerVolumeSpecName "kube-api-access-448bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.678148 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da04224f-997b-4890-b0c8-2bf983b1f21d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "da04224f-997b-4890-b0c8-2bf983b1f21d" (UID: "da04224f-997b-4890-b0c8-2bf983b1f21d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.679031 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b419e3-b339-4a82-8cb1-c14467712c1f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c7b419e3-b339-4a82-8cb1-c14467712c1f" (UID: "c7b419e3-b339-4a82-8cb1-c14467712c1f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.686030 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b419e3-b339-4a82-8cb1-c14467712c1f-kube-api-access-j7tc4" (OuterVolumeSpecName: "kube-api-access-j7tc4") pod "c7b419e3-b339-4a82-8cb1-c14467712c1f" (UID: "c7b419e3-b339-4a82-8cb1-c14467712c1f"). InnerVolumeSpecName "kube-api-access-j7tc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.774907 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c4h9\" (UniqueName: \"kubernetes.io/projected/1e2babd0-af3f-40e6-84ed-19e46619f5c9-kube-api-access-9c4h9\") pod \"route-controller-manager-8494ffc7-45sdf\" (UID: \"1e2babd0-af3f-40e6-84ed-19e46619f5c9\") " pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.775006 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e2babd0-af3f-40e6-84ed-19e46619f5c9-config\") pod \"route-controller-manager-8494ffc7-45sdf\" (UID: \"1e2babd0-af3f-40e6-84ed-19e46619f5c9\") " pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.775041 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e2babd0-af3f-40e6-84ed-19e46619f5c9-client-ca\") pod \"route-controller-manager-8494ffc7-45sdf\" (UID: \"1e2babd0-af3f-40e6-84ed-19e46619f5c9\") " pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.775065 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e2babd0-af3f-40e6-84ed-19e46619f5c9-serving-cert\") pod \"route-controller-manager-8494ffc7-45sdf\" (UID: \"1e2babd0-af3f-40e6-84ed-19e46619f5c9\") " pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.775102 4964 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c7b419e3-b339-4a82-8cb1-c14467712c1f-client-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.775112 4964 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-client-ca\") on node \"crc\" DevicePath \"\"" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.775121 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7b419e3-b339-4a82-8cb1-c14467712c1f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.775130 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7tc4\" (UniqueName: \"kubernetes.io/projected/c7b419e3-b339-4a82-8cb1-c14467712c1f-kube-api-access-j7tc4\") on node \"crc\" DevicePath \"\"" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.775140 4964 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da04224f-997b-4890-b0c8-2bf983b1f21d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.775149 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7b419e3-b339-4a82-8cb1-c14467712c1f-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.775158 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.775167 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-448bq\" (UniqueName: \"kubernetes.io/projected/da04224f-997b-4890-b0c8-2bf983b1f21d-kube-api-access-448bq\") on node \"crc\" DevicePath \"\"" Oct 04 02:52:49 crc kubenswrapper[4964]: 
I1004 02:52:49.775176 4964 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da04224f-997b-4890-b0c8-2bf983b1f21d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.776907 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e2babd0-af3f-40e6-84ed-19e46619f5c9-config\") pod \"route-controller-manager-8494ffc7-45sdf\" (UID: \"1e2babd0-af3f-40e6-84ed-19e46619f5c9\") " pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.776959 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e2babd0-af3f-40e6-84ed-19e46619f5c9-client-ca\") pod \"route-controller-manager-8494ffc7-45sdf\" (UID: \"1e2babd0-af3f-40e6-84ed-19e46619f5c9\") " pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.780894 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e2babd0-af3f-40e6-84ed-19e46619f5c9-serving-cert\") pod \"route-controller-manager-8494ffc7-45sdf\" (UID: \"1e2babd0-af3f-40e6-84ed-19e46619f5c9\") " pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.791400 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c4h9\" (UniqueName: \"kubernetes.io/projected/1e2babd0-af3f-40e6-84ed-19e46619f5c9-kube-api-access-9c4h9\") pod \"route-controller-manager-8494ffc7-45sdf\" (UID: \"1e2babd0-af3f-40e6-84ed-19e46619f5c9\") " pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:49 crc kubenswrapper[4964]: I1004 02:52:49.915816 4964 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.067535 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mtvpm" event={"ID":"2c77936f-8b10-43f3-af24-af8fa6c201e1","Type":"ContainerStarted","Data":"2a91eb535ffb4a4935e4bdc1e40c142856d03c97997b6ce06dda05331756a3f8"} Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.067699 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-mtvpm" podUID="2c77936f-8b10-43f3-af24-af8fa6c201e1" containerName="registry-server" containerID="cri-o://2a91eb535ffb4a4935e4bdc1e40c142856d03c97997b6ce06dda05331756a3f8" gracePeriod=2 Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.071589 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" event={"ID":"da04224f-997b-4890-b0c8-2bf983b1f21d","Type":"ContainerDied","Data":"7be4ebfdf3a31707aa09ca0a18458bfd27ceded8e8d5fdebf1a2c8e9c7e5f378"} Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.071652 4964 scope.go:117] "RemoveContainer" containerID="dde9250f8905fa22eb0e6a9888a34b5013ae5c645a8507bfe8521fa0c4309bdc" Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.071706 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nr92k" Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.087072 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" event={"ID":"c7b419e3-b339-4a82-8cb1-c14467712c1f","Type":"ContainerDied","Data":"5ca451b1c2e525e4d9fa4481655f762f04b3a6334e4e70a8fce9a60fd3f5e550"} Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.091717 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mtvpm" podStartSLOduration=1.308732288 podStartE2EDuration="5.091693974s" podCreationTimestamp="2025-10-04 02:52:45 +0000 UTC" firstStartedPulling="2025-10-04 02:52:45.913208639 +0000 UTC m=+745.810167287" lastFinishedPulling="2025-10-04 02:52:49.696170335 +0000 UTC m=+749.593128973" observedRunningTime="2025-10-04 02:52:50.081877904 +0000 UTC m=+749.978836582" watchObservedRunningTime="2025-10-04 02:52:50.091693974 +0000 UTC m=+749.988652642" Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.099874 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh" Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.110321 4964 scope.go:117] "RemoveContainer" containerID="78ec000dcd20eb17f9deb299cce5dbe68a7bbd40ef1808db1f07120a7481c3ab" Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.117900 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-l59hf"] Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.155241 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf"] Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.181515 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh"] Oct 04 02:52:50 crc kubenswrapper[4964]: W1004 02:52:50.182863 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e2babd0_af3f_40e6_84ed_19e46619f5c9.slice/crio-121cd9e68d3bae7ff8d25ee551ee37e9115bf5e894926633cbc6e1daeecaa0e7 WatchSource:0}: Error finding container 121cd9e68d3bae7ff8d25ee551ee37e9115bf5e894926633cbc6e1daeecaa0e7: Status 404 returned error can't find the container with id 121cd9e68d3bae7ff8d25ee551ee37e9115bf5e894926633cbc6e1daeecaa0e7 Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.187682 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rm5rh"] Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.191904 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nr92k"] Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.195582 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nr92k"] Oct 04 02:52:50 
crc kubenswrapper[4964]: I1004 02:52:50.426950 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mtvpm" Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.575601 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6xh6v" Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.586142 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvl52\" (UniqueName: \"kubernetes.io/projected/2c77936f-8b10-43f3-af24-af8fa6c201e1-kube-api-access-gvl52\") pod \"2c77936f-8b10-43f3-af24-af8fa6c201e1\" (UID: \"2c77936f-8b10-43f3-af24-af8fa6c201e1\") " Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.599938 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c77936f-8b10-43f3-af24-af8fa6c201e1-kube-api-access-gvl52" (OuterVolumeSpecName: "kube-api-access-gvl52") pod "2c77936f-8b10-43f3-af24-af8fa6c201e1" (UID: "2c77936f-8b10-43f3-af24-af8fa6c201e1"). InnerVolumeSpecName "kube-api-access-gvl52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.688130 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvl52\" (UniqueName: \"kubernetes.io/projected/2c77936f-8b10-43f3-af24-af8fa6c201e1-kube-api-access-gvl52\") on node \"crc\" DevicePath \"\"" Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.855450 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7b419e3-b339-4a82-8cb1-c14467712c1f" path="/var/lib/kubelet/pods/c7b419e3-b339-4a82-8cb1-c14467712c1f/volumes" Oct 04 02:52:50 crc kubenswrapper[4964]: I1004 02:52:50.856270 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da04224f-997b-4890-b0c8-2bf983b1f21d" path="/var/lib/kubelet/pods/da04224f-997b-4890-b0c8-2bf983b1f21d/volumes" Oct 04 02:52:51 crc kubenswrapper[4964]: I1004 02:52:51.097780 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" event={"ID":"1e2babd0-af3f-40e6-84ed-19e46619f5c9","Type":"ContainerStarted","Data":"08823033b38d585b8861d97dd2bc1638346e3a31846e17ff1c19200eae8d2922"} Oct 04 02:52:51 crc kubenswrapper[4964]: I1004 02:52:51.097831 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" event={"ID":"1e2babd0-af3f-40e6-84ed-19e46619f5c9","Type":"ContainerStarted","Data":"121cd9e68d3bae7ff8d25ee551ee37e9115bf5e894926633cbc6e1daeecaa0e7"} Oct 04 02:52:51 crc kubenswrapper[4964]: I1004 02:52:51.099712 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-l59hf" event={"ID":"a72faf85-343c-4d46-8773-97a366ed031a","Type":"ContainerStarted","Data":"6ec69f52267f05521bc6751b0135c6498b18c1cd2fb1e846bfa41f2b8881f0b9"} Oct 04 02:52:51 crc kubenswrapper[4964]: I1004 02:52:51.099746 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-l59hf" event={"ID":"a72faf85-343c-4d46-8773-97a366ed031a","Type":"ContainerStarted","Data":"fe2b320c50f7e2d275c38580bd733c06d793c098937a82a87a71afada608314f"} Oct 04 02:52:51 crc kubenswrapper[4964]: I1004 02:52:51.102673 4964 generic.go:334] "Generic (PLEG): container finished" podID="2c77936f-8b10-43f3-af24-af8fa6c201e1" containerID="2a91eb535ffb4a4935e4bdc1e40c142856d03c97997b6ce06dda05331756a3f8" exitCode=0 Oct 04 02:52:51 crc kubenswrapper[4964]: I1004 02:52:51.102725 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mtvpm" event={"ID":"2c77936f-8b10-43f3-af24-af8fa6c201e1","Type":"ContainerDied","Data":"2a91eb535ffb4a4935e4bdc1e40c142856d03c97997b6ce06dda05331756a3f8"} Oct 04 02:52:51 crc kubenswrapper[4964]: I1004 02:52:51.102781 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mtvpm" event={"ID":"2c77936f-8b10-43f3-af24-af8fa6c201e1","Type":"ContainerDied","Data":"4cf5371181087cf29fa00bbaa7c66d93def2c364ed39427ea202dac661c71b87"} Oct 04 02:52:51 crc kubenswrapper[4964]: I1004 02:52:51.102777 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mtvpm" Oct 04 02:52:51 crc kubenswrapper[4964]: I1004 02:52:51.102810 4964 scope.go:117] "RemoveContainer" containerID="2a91eb535ffb4a4935e4bdc1e40c142856d03c97997b6ce06dda05331756a3f8" Oct 04 02:52:51 crc kubenswrapper[4964]: I1004 02:52:51.132440 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" podStartSLOduration=3.132416034 podStartE2EDuration="3.132416034s" podCreationTimestamp="2025-10-04 02:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:52:51.130460473 +0000 UTC m=+751.027419121" watchObservedRunningTime="2025-10-04 02:52:51.132416034 +0000 UTC m=+751.029374702" Oct 04 02:52:51 crc kubenswrapper[4964]: I1004 02:52:51.133953 4964 scope.go:117] "RemoveContainer" containerID="2a91eb535ffb4a4935e4bdc1e40c142856d03c97997b6ce06dda05331756a3f8" Oct 04 02:52:51 crc kubenswrapper[4964]: E1004 02:52:51.134561 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a91eb535ffb4a4935e4bdc1e40c142856d03c97997b6ce06dda05331756a3f8\": container with ID starting with 2a91eb535ffb4a4935e4bdc1e40c142856d03c97997b6ce06dda05331756a3f8 not found: ID does not exist" containerID="2a91eb535ffb4a4935e4bdc1e40c142856d03c97997b6ce06dda05331756a3f8" Oct 04 02:52:51 crc kubenswrapper[4964]: I1004 02:52:51.134605 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a91eb535ffb4a4935e4bdc1e40c142856d03c97997b6ce06dda05331756a3f8"} err="failed to get container status \"2a91eb535ffb4a4935e4bdc1e40c142856d03c97997b6ce06dda05331756a3f8\": rpc error: code = NotFound desc = could not find container \"2a91eb535ffb4a4935e4bdc1e40c142856d03c97997b6ce06dda05331756a3f8\": container with ID 
starting with 2a91eb535ffb4a4935e4bdc1e40c142856d03c97997b6ce06dda05331756a3f8 not found: ID does not exist" Oct 04 02:52:51 crc kubenswrapper[4964]: I1004 02:52:51.166522 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-l59hf" podStartSLOduration=2.073142987 podStartE2EDuration="2.166494716s" podCreationTimestamp="2025-10-04 02:52:49 +0000 UTC" firstStartedPulling="2025-10-04 02:52:50.119880549 +0000 UTC m=+750.016839197" lastFinishedPulling="2025-10-04 02:52:50.213232288 +0000 UTC m=+750.110190926" observedRunningTime="2025-10-04 02:52:51.155195538 +0000 UTC m=+751.052154216" watchObservedRunningTime="2025-10-04 02:52:51.166494716 +0000 UTC m=+751.063453394" Oct 04 02:52:51 crc kubenswrapper[4964]: I1004 02:52:51.174324 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mtvpm"] Oct 04 02:52:51 crc kubenswrapper[4964]: I1004 02:52:51.180657 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-mtvpm"] Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.115577 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.132862 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8494ffc7-45sdf" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.494931 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6"] Oct 04 02:52:52 crc kubenswrapper[4964]: E1004 02:52:52.495555 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c77936f-8b10-43f3-af24-af8fa6c201e1" containerName="registry-server" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.495576 4964 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2c77936f-8b10-43f3-af24-af8fa6c201e1" containerName="registry-server" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.495822 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c77936f-8b10-43f3-af24-af8fa6c201e1" containerName="registry-server" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.496465 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.501548 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.502177 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.502451 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.502515 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.502728 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.502942 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.516502 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6"] Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.516582 4964 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.621715 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab-proxy-ca-bundles\") pod \"controller-manager-98cf9bfb8-vk8t6\" (UID: \"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab\") " pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.621771 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab-serving-cert\") pod \"controller-manager-98cf9bfb8-vk8t6\" (UID: \"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab\") " pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.621856 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq8qs\" (UniqueName: \"kubernetes.io/projected/6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab-kube-api-access-fq8qs\") pod \"controller-manager-98cf9bfb8-vk8t6\" (UID: \"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab\") " pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.621922 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab-config\") pod \"controller-manager-98cf9bfb8-vk8t6\" (UID: \"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab\") " pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.621961 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab-client-ca\") pod \"controller-manager-98cf9bfb8-vk8t6\" (UID: \"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab\") " pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.722666 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq8qs\" (UniqueName: \"kubernetes.io/projected/6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab-kube-api-access-fq8qs\") pod \"controller-manager-98cf9bfb8-vk8t6\" (UID: \"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab\") " pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.722732 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab-config\") pod \"controller-manager-98cf9bfb8-vk8t6\" (UID: \"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab\") " pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.722768 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab-client-ca\") pod \"controller-manager-98cf9bfb8-vk8t6\" (UID: \"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab\") " pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.722879 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab-proxy-ca-bundles\") pod \"controller-manager-98cf9bfb8-vk8t6\" (UID: \"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab\") " pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.722915 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab-serving-cert\") pod \"controller-manager-98cf9bfb8-vk8t6\" (UID: \"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab\") " pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.724434 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab-client-ca\") pod \"controller-manager-98cf9bfb8-vk8t6\" (UID: \"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab\") " pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.725141 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab-config\") pod \"controller-manager-98cf9bfb8-vk8t6\" (UID: \"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab\") " pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.725545 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab-proxy-ca-bundles\") pod \"controller-manager-98cf9bfb8-vk8t6\" (UID: \"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab\") " pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.730698 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab-serving-cert\") pod \"controller-manager-98cf9bfb8-vk8t6\" (UID: \"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab\") " pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc 
kubenswrapper[4964]: I1004 02:52:52.741119 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq8qs\" (UniqueName: \"kubernetes.io/projected/6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab-kube-api-access-fq8qs\") pod \"controller-manager-98cf9bfb8-vk8t6\" (UID: \"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab\") " pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.814924 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:52 crc kubenswrapper[4964]: I1004 02:52:52.856253 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c77936f-8b10-43f3-af24-af8fa6c201e1" path="/var/lib/kubelet/pods/2c77936f-8b10-43f3-af24-af8fa6c201e1/volumes" Oct 04 02:52:53 crc kubenswrapper[4964]: I1004 02:52:53.103276 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6"] Oct 04 02:52:53 crc kubenswrapper[4964]: I1004 02:52:53.129836 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" event={"ID":"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab","Type":"ContainerStarted","Data":"1bb7a98431821475d1486e2db6349efe75de6556945a6742dc6cff44cb7c5ffa"} Oct 04 02:52:54 crc kubenswrapper[4964]: I1004 02:52:54.139495 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" event={"ID":"6fcd6e21-eaeb-467b-8ef7-26c0577cc0ab","Type":"ContainerStarted","Data":"72481368103d89e8ba5bbbb1db43c37060d8bb51186eae32ad95013e08ec7a39"} Oct 04 02:52:54 crc kubenswrapper[4964]: I1004 02:52:54.139905 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:54 crc kubenswrapper[4964]: I1004 02:52:54.148408 
4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" Oct 04 02:52:54 crc kubenswrapper[4964]: I1004 02:52:54.159449 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-98cf9bfb8-vk8t6" podStartSLOduration=6.1594213700000005 podStartE2EDuration="6.15942137s" podCreationTimestamp="2025-10-04 02:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:52:54.156555584 +0000 UTC m=+754.053514272" watchObservedRunningTime="2025-10-04 02:52:54.15942137 +0000 UTC m=+754.056380048" Oct 04 02:52:55 crc kubenswrapper[4964]: I1004 02:52:55.103609 4964 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 04 02:52:59 crc kubenswrapper[4964]: I1004 02:52:59.601970 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-l59hf" Oct 04 02:52:59 crc kubenswrapper[4964]: I1004 02:52:59.602966 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-l59hf" Oct 04 02:52:59 crc kubenswrapper[4964]: I1004 02:52:59.645947 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-l59hf" Oct 04 02:53:00 crc kubenswrapper[4964]: I1004 02:53:00.223746 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-l59hf" Oct 04 02:53:00 crc kubenswrapper[4964]: I1004 02:53:00.577874 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-fs4w8" Oct 04 02:53:04 crc kubenswrapper[4964]: I1004 02:53:04.449812 4964 patch_prober.go:28] interesting 
pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:53:04 crc kubenswrapper[4964]: I1004 02:53:04.450371 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:53:07 crc kubenswrapper[4964]: I1004 02:53:07.367687 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8"] Oct 04 02:53:07 crc kubenswrapper[4964]: I1004 02:53:07.369840 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" Oct 04 02:53:07 crc kubenswrapper[4964]: I1004 02:53:07.372152 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-x5jws" Oct 04 02:53:07 crc kubenswrapper[4964]: I1004 02:53:07.395607 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8"] Oct 04 02:53:07 crc kubenswrapper[4964]: I1004 02:53:07.448545 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f508dab2-f748-4821-a9c9-c06405b3ecd5-bundle\") pod \"22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8\" (UID: \"f508dab2-f748-4821-a9c9-c06405b3ecd5\") " pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" Oct 04 02:53:07 crc kubenswrapper[4964]: I1004 
02:53:07.448748 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f508dab2-f748-4821-a9c9-c06405b3ecd5-util\") pod \"22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8\" (UID: \"f508dab2-f748-4821-a9c9-c06405b3ecd5\") " pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" Oct 04 02:53:07 crc kubenswrapper[4964]: I1004 02:53:07.448790 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crfdf\" (UniqueName: \"kubernetes.io/projected/f508dab2-f748-4821-a9c9-c06405b3ecd5-kube-api-access-crfdf\") pod \"22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8\" (UID: \"f508dab2-f748-4821-a9c9-c06405b3ecd5\") " pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" Oct 04 02:53:07 crc kubenswrapper[4964]: I1004 02:53:07.550596 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f508dab2-f748-4821-a9c9-c06405b3ecd5-bundle\") pod \"22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8\" (UID: \"f508dab2-f748-4821-a9c9-c06405b3ecd5\") " pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" Oct 04 02:53:07 crc kubenswrapper[4964]: I1004 02:53:07.550707 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f508dab2-f748-4821-a9c9-c06405b3ecd5-util\") pod \"22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8\" (UID: \"f508dab2-f748-4821-a9c9-c06405b3ecd5\") " pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" Oct 04 02:53:07 crc kubenswrapper[4964]: I1004 02:53:07.550734 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crfdf\" 
(UniqueName: \"kubernetes.io/projected/f508dab2-f748-4821-a9c9-c06405b3ecd5-kube-api-access-crfdf\") pod \"22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8\" (UID: \"f508dab2-f748-4821-a9c9-c06405b3ecd5\") " pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" Oct 04 02:53:07 crc kubenswrapper[4964]: I1004 02:53:07.551369 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f508dab2-f748-4821-a9c9-c06405b3ecd5-util\") pod \"22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8\" (UID: \"f508dab2-f748-4821-a9c9-c06405b3ecd5\") " pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" Oct 04 02:53:07 crc kubenswrapper[4964]: I1004 02:53:07.551370 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f508dab2-f748-4821-a9c9-c06405b3ecd5-bundle\") pod \"22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8\" (UID: \"f508dab2-f748-4821-a9c9-c06405b3ecd5\") " pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" Oct 04 02:53:07 crc kubenswrapper[4964]: I1004 02:53:07.571423 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crfdf\" (UniqueName: \"kubernetes.io/projected/f508dab2-f748-4821-a9c9-c06405b3ecd5-kube-api-access-crfdf\") pod \"22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8\" (UID: \"f508dab2-f748-4821-a9c9-c06405b3ecd5\") " pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" Oct 04 02:53:07 crc kubenswrapper[4964]: I1004 02:53:07.702898 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" Oct 04 02:53:08 crc kubenswrapper[4964]: I1004 02:53:08.171410 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8"] Oct 04 02:53:08 crc kubenswrapper[4964]: I1004 02:53:08.258098 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" event={"ID":"f508dab2-f748-4821-a9c9-c06405b3ecd5","Type":"ContainerStarted","Data":"630f4bdb12be4d007de6eaeee4c5a9370d8e0787f559bda0d88dbc8980b09fbe"} Oct 04 02:53:09 crc kubenswrapper[4964]: I1004 02:53:09.274833 4964 generic.go:334] "Generic (PLEG): container finished" podID="f508dab2-f748-4821-a9c9-c06405b3ecd5" containerID="fcf5a20595973c94b0be8254a8047c7c13bd13bc26819ccd1764b11b17270e89" exitCode=0 Oct 04 02:53:09 crc kubenswrapper[4964]: I1004 02:53:09.274947 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" event={"ID":"f508dab2-f748-4821-a9c9-c06405b3ecd5","Type":"ContainerDied","Data":"fcf5a20595973c94b0be8254a8047c7c13bd13bc26819ccd1764b11b17270e89"} Oct 04 02:53:10 crc kubenswrapper[4964]: I1004 02:53:10.286902 4964 generic.go:334] "Generic (PLEG): container finished" podID="f508dab2-f748-4821-a9c9-c06405b3ecd5" containerID="d94f6193a0d237619bdfb25f3b79d5e36919e163f78d118f1d5b09fe33da4d26" exitCode=0 Oct 04 02:53:10 crc kubenswrapper[4964]: I1004 02:53:10.287077 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" event={"ID":"f508dab2-f748-4821-a9c9-c06405b3ecd5","Type":"ContainerDied","Data":"d94f6193a0d237619bdfb25f3b79d5e36919e163f78d118f1d5b09fe33da4d26"} Oct 04 02:53:11 crc kubenswrapper[4964]: I1004 02:53:11.297647 4964 generic.go:334] 
"Generic (PLEG): container finished" podID="f508dab2-f748-4821-a9c9-c06405b3ecd5" containerID="2cf708e4a078f8b1bb13b13255da12813602dedabc7be13f2a75c9464b8cf216" exitCode=0 Oct 04 02:53:11 crc kubenswrapper[4964]: I1004 02:53:11.297696 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" event={"ID":"f508dab2-f748-4821-a9c9-c06405b3ecd5","Type":"ContainerDied","Data":"2cf708e4a078f8b1bb13b13255da12813602dedabc7be13f2a75c9464b8cf216"} Oct 04 02:53:12 crc kubenswrapper[4964]: I1004 02:53:12.822358 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" Oct 04 02:53:12 crc kubenswrapper[4964]: I1004 02:53:12.930727 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f508dab2-f748-4821-a9c9-c06405b3ecd5-util\") pod \"f508dab2-f748-4821-a9c9-c06405b3ecd5\" (UID: \"f508dab2-f748-4821-a9c9-c06405b3ecd5\") " Oct 04 02:53:12 crc kubenswrapper[4964]: I1004 02:53:12.931274 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crfdf\" (UniqueName: \"kubernetes.io/projected/f508dab2-f748-4821-a9c9-c06405b3ecd5-kube-api-access-crfdf\") pod \"f508dab2-f748-4821-a9c9-c06405b3ecd5\" (UID: \"f508dab2-f748-4821-a9c9-c06405b3ecd5\") " Oct 04 02:53:12 crc kubenswrapper[4964]: I1004 02:53:12.931333 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f508dab2-f748-4821-a9c9-c06405b3ecd5-bundle\") pod \"f508dab2-f748-4821-a9c9-c06405b3ecd5\" (UID: \"f508dab2-f748-4821-a9c9-c06405b3ecd5\") " Oct 04 02:53:12 crc kubenswrapper[4964]: I1004 02:53:12.932061 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f508dab2-f748-4821-a9c9-c06405b3ecd5-bundle" (OuterVolumeSpecName: "bundle") pod "f508dab2-f748-4821-a9c9-c06405b3ecd5" (UID: "f508dab2-f748-4821-a9c9-c06405b3ecd5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:53:12 crc kubenswrapper[4964]: I1004 02:53:12.932487 4964 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f508dab2-f748-4821-a9c9-c06405b3ecd5-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:53:12 crc kubenswrapper[4964]: I1004 02:53:12.938519 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f508dab2-f748-4821-a9c9-c06405b3ecd5-kube-api-access-crfdf" (OuterVolumeSpecName: "kube-api-access-crfdf") pod "f508dab2-f748-4821-a9c9-c06405b3ecd5" (UID: "f508dab2-f748-4821-a9c9-c06405b3ecd5"). InnerVolumeSpecName "kube-api-access-crfdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:53:12 crc kubenswrapper[4964]: I1004 02:53:12.966516 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f508dab2-f748-4821-a9c9-c06405b3ecd5-util" (OuterVolumeSpecName: "util") pod "f508dab2-f748-4821-a9c9-c06405b3ecd5" (UID: "f508dab2-f748-4821-a9c9-c06405b3ecd5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:53:13 crc kubenswrapper[4964]: I1004 02:53:13.034393 4964 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f508dab2-f748-4821-a9c9-c06405b3ecd5-util\") on node \"crc\" DevicePath \"\"" Oct 04 02:53:13 crc kubenswrapper[4964]: I1004 02:53:13.034456 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crfdf\" (UniqueName: \"kubernetes.io/projected/f508dab2-f748-4821-a9c9-c06405b3ecd5-kube-api-access-crfdf\") on node \"crc\" DevicePath \"\"" Oct 04 02:53:13 crc kubenswrapper[4964]: I1004 02:53:13.314788 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" event={"ID":"f508dab2-f748-4821-a9c9-c06405b3ecd5","Type":"ContainerDied","Data":"630f4bdb12be4d007de6eaeee4c5a9370d8e0787f559bda0d88dbc8980b09fbe"} Oct 04 02:53:13 crc kubenswrapper[4964]: I1004 02:53:13.314863 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="630f4bdb12be4d007de6eaeee4c5a9370d8e0787f559bda0d88dbc8980b09fbe" Oct 04 02:53:13 crc kubenswrapper[4964]: I1004 02:53:13.314987 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.072825 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5798c59cb8-rgpft"] Oct 04 02:53:20 crc kubenswrapper[4964]: E1004 02:53:20.073738 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f508dab2-f748-4821-a9c9-c06405b3ecd5" containerName="util" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.073756 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="f508dab2-f748-4821-a9c9-c06405b3ecd5" containerName="util" Oct 04 02:53:20 crc kubenswrapper[4964]: E1004 02:53:20.073777 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f508dab2-f748-4821-a9c9-c06405b3ecd5" containerName="pull" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.073786 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="f508dab2-f748-4821-a9c9-c06405b3ecd5" containerName="pull" Oct 04 02:53:20 crc kubenswrapper[4964]: E1004 02:53:20.073799 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f508dab2-f748-4821-a9c9-c06405b3ecd5" containerName="extract" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.073808 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="f508dab2-f748-4821-a9c9-c06405b3ecd5" containerName="extract" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.073974 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="f508dab2-f748-4821-a9c9-c06405b3ecd5" containerName="extract" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.074765 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5798c59cb8-rgpft" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.076442 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-5q22n" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.101083 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5798c59cb8-rgpft"] Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.236989 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr8bz\" (UniqueName: \"kubernetes.io/projected/939d1a8b-d019-424e-bf3a-601213f46341-kube-api-access-xr8bz\") pod \"openstack-operator-controller-operator-5798c59cb8-rgpft\" (UID: \"939d1a8b-d019-424e-bf3a-601213f46341\") " pod="openstack-operators/openstack-operator-controller-operator-5798c59cb8-rgpft" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.338130 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr8bz\" (UniqueName: \"kubernetes.io/projected/939d1a8b-d019-424e-bf3a-601213f46341-kube-api-access-xr8bz\") pod \"openstack-operator-controller-operator-5798c59cb8-rgpft\" (UID: \"939d1a8b-d019-424e-bf3a-601213f46341\") " pod="openstack-operators/openstack-operator-controller-operator-5798c59cb8-rgpft" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.361076 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr8bz\" (UniqueName: \"kubernetes.io/projected/939d1a8b-d019-424e-bf3a-601213f46341-kube-api-access-xr8bz\") pod \"openstack-operator-controller-operator-5798c59cb8-rgpft\" (UID: \"939d1a8b-d019-424e-bf3a-601213f46341\") " pod="openstack-operators/openstack-operator-controller-operator-5798c59cb8-rgpft" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.393992 4964 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5798c59cb8-rgpft" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.662505 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cpdhg"] Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.663776 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.684368 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cpdhg"] Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.744357 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-utilities\") pod \"redhat-marketplace-cpdhg\" (UID: \"96c946dd-9fb6-4b72-8df1-cf5ae54e856b\") " pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.744451 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h7lc\" (UniqueName: \"kubernetes.io/projected/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-kube-api-access-6h7lc\") pod \"redhat-marketplace-cpdhg\" (UID: \"96c946dd-9fb6-4b72-8df1-cf5ae54e856b\") " pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.744474 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-catalog-content\") pod \"redhat-marketplace-cpdhg\" (UID: \"96c946dd-9fb6-4b72-8df1-cf5ae54e856b\") " pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.775323 4964 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5798c59cb8-rgpft"] Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.851086 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h7lc\" (UniqueName: \"kubernetes.io/projected/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-kube-api-access-6h7lc\") pod \"redhat-marketplace-cpdhg\" (UID: \"96c946dd-9fb6-4b72-8df1-cf5ae54e856b\") " pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.851124 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-catalog-content\") pod \"redhat-marketplace-cpdhg\" (UID: \"96c946dd-9fb6-4b72-8df1-cf5ae54e856b\") " pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.851177 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-utilities\") pod \"redhat-marketplace-cpdhg\" (UID: \"96c946dd-9fb6-4b72-8df1-cf5ae54e856b\") " pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.851957 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-utilities\") pod \"redhat-marketplace-cpdhg\" (UID: \"96c946dd-9fb6-4b72-8df1-cf5ae54e856b\") " pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.852670 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-catalog-content\") pod \"redhat-marketplace-cpdhg\" (UID: 
\"96c946dd-9fb6-4b72-8df1-cf5ae54e856b\") " pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.874876 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h7lc\" (UniqueName: \"kubernetes.io/projected/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-kube-api-access-6h7lc\") pod \"redhat-marketplace-cpdhg\" (UID: \"96c946dd-9fb6-4b72-8df1-cf5ae54e856b\") " pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:20 crc kubenswrapper[4964]: I1004 02:53:20.997276 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:21 crc kubenswrapper[4964]: I1004 02:53:21.380137 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5798c59cb8-rgpft" event={"ID":"939d1a8b-d019-424e-bf3a-601213f46341","Type":"ContainerStarted","Data":"f621eba440c94608687b15880d1e49fe298526b29374cba263dc92058a165cf3"} Oct 04 02:53:21 crc kubenswrapper[4964]: I1004 02:53:21.432425 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cpdhg"] Oct 04 02:53:22 crc kubenswrapper[4964]: I1004 02:53:22.386888 4964 generic.go:334] "Generic (PLEG): container finished" podID="96c946dd-9fb6-4b72-8df1-cf5ae54e856b" containerID="60f4b675d6bd603a752d80e0c87195fe12d6b339f0a132750f425a31ea3d2ae8" exitCode=0 Oct 04 02:53:22 crc kubenswrapper[4964]: I1004 02:53:22.387005 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpdhg" event={"ID":"96c946dd-9fb6-4b72-8df1-cf5ae54e856b","Type":"ContainerDied","Data":"60f4b675d6bd603a752d80e0c87195fe12d6b339f0a132750f425a31ea3d2ae8"} Oct 04 02:53:22 crc kubenswrapper[4964]: I1004 02:53:22.388104 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpdhg" 
event={"ID":"96c946dd-9fb6-4b72-8df1-cf5ae54e856b","Type":"ContainerStarted","Data":"40de5f1698855c446c89cf7b4b20b648806affe3c207060a1e67732384306d97"} Oct 04 02:53:25 crc kubenswrapper[4964]: I1004 02:53:25.421031 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5798c59cb8-rgpft" event={"ID":"939d1a8b-d019-424e-bf3a-601213f46341","Type":"ContainerStarted","Data":"08cd2b35418a9467531304094c6ba154bd27e8bce55899c70e5dbe7d02d93cff"} Oct 04 02:53:25 crc kubenswrapper[4964]: I1004 02:53:25.427277 4964 generic.go:334] "Generic (PLEG): container finished" podID="96c946dd-9fb6-4b72-8df1-cf5ae54e856b" containerID="f41c84133cf10b92a96943a1f94e5bf9807a7cc90f0d8f9e0c94f9539219f8ec" exitCode=0 Oct 04 02:53:25 crc kubenswrapper[4964]: I1004 02:53:25.427341 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpdhg" event={"ID":"96c946dd-9fb6-4b72-8df1-cf5ae54e856b","Type":"ContainerDied","Data":"f41c84133cf10b92a96943a1f94e5bf9807a7cc90f0d8f9e0c94f9539219f8ec"} Oct 04 02:53:27 crc kubenswrapper[4964]: I1004 02:53:27.444636 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpdhg" event={"ID":"96c946dd-9fb6-4b72-8df1-cf5ae54e856b","Type":"ContainerStarted","Data":"68b3afdf8dbcd2d150e269a2bd753c535e9298e166bc4f8e6e78b3425eb0320f"} Oct 04 02:53:27 crc kubenswrapper[4964]: I1004 02:53:27.447414 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5798c59cb8-rgpft" event={"ID":"939d1a8b-d019-424e-bf3a-601213f46341","Type":"ContainerStarted","Data":"89178bf85bc437b885e081d32ce9ff7acfedd7ee6119af7c1dbe2b3742b4e7ed"} Oct 04 02:53:27 crc kubenswrapper[4964]: I1004 02:53:27.448106 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5798c59cb8-rgpft" Oct 04 02:53:27 crc 
kubenswrapper[4964]: I1004 02:53:27.478549 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cpdhg" podStartSLOduration=3.924967065 podStartE2EDuration="7.478526434s" podCreationTimestamp="2025-10-04 02:53:20 +0000 UTC" firstStartedPulling="2025-10-04 02:53:23.393388797 +0000 UTC m=+783.290347435" lastFinishedPulling="2025-10-04 02:53:26.946948156 +0000 UTC m=+786.843906804" observedRunningTime="2025-10-04 02:53:27.474747464 +0000 UTC m=+787.371706162" watchObservedRunningTime="2025-10-04 02:53:27.478526434 +0000 UTC m=+787.375485102" Oct 04 02:53:27 crc kubenswrapper[4964]: I1004 02:53:27.527234 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5798c59cb8-rgpft" podStartSLOduration=1.363795469 podStartE2EDuration="7.527211454s" podCreationTimestamp="2025-10-04 02:53:20 +0000 UTC" firstStartedPulling="2025-10-04 02:53:20.784092496 +0000 UTC m=+780.681051134" lastFinishedPulling="2025-10-04 02:53:26.947508451 +0000 UTC m=+786.844467119" observedRunningTime="2025-10-04 02:53:27.522687866 +0000 UTC m=+787.419646524" watchObservedRunningTime="2025-10-04 02:53:27.527211454 +0000 UTC m=+787.424170102" Oct 04 02:53:29 crc kubenswrapper[4964]: I1004 02:53:29.466314 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5798c59cb8-rgpft" Oct 04 02:53:30 crc kubenswrapper[4964]: I1004 02:53:30.997751 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:30 crc kubenswrapper[4964]: I1004 02:53:30.998090 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:31 crc kubenswrapper[4964]: I1004 02:53:31.072057 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:34 crc kubenswrapper[4964]: I1004 02:53:34.449189 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:53:34 crc kubenswrapper[4964]: I1004 02:53:34.449548 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:53:34 crc kubenswrapper[4964]: I1004 02:53:34.449610 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:53:34 crc kubenswrapper[4964]: I1004 02:53:34.450419 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6efe9a8f74bbf3944c47eb916499cc67675937487bfe4fb926abac0853174b18"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 02:53:34 crc kubenswrapper[4964]: I1004 02:53:34.450497 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://6efe9a8f74bbf3944c47eb916499cc67675937487bfe4fb926abac0853174b18" gracePeriod=600 Oct 04 02:53:35 crc kubenswrapper[4964]: I1004 02:53:35.512526 4964 generic.go:334] "Generic (PLEG): container finished" 
podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="6efe9a8f74bbf3944c47eb916499cc67675937487bfe4fb926abac0853174b18" exitCode=0 Oct 04 02:53:35 crc kubenswrapper[4964]: I1004 02:53:35.512638 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"6efe9a8f74bbf3944c47eb916499cc67675937487bfe4fb926abac0853174b18"} Oct 04 02:53:35 crc kubenswrapper[4964]: I1004 02:53:35.512998 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"008a6133f8963f8c25283a4615f3f65b17e14a1929e0bda2e812a4ec5ec09c24"} Oct 04 02:53:35 crc kubenswrapper[4964]: I1004 02:53:35.513030 4964 scope.go:117] "RemoveContainer" containerID="1137f47c0aa6462aae7846003655bf95319654803dc6e0eaf3afe56be26f6389" Oct 04 02:53:41 crc kubenswrapper[4964]: I1004 02:53:41.070545 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:41 crc kubenswrapper[4964]: I1004 02:53:41.127223 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cpdhg"] Oct 04 02:53:41 crc kubenswrapper[4964]: I1004 02:53:41.556791 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cpdhg" podUID="96c946dd-9fb6-4b72-8df1-cf5ae54e856b" containerName="registry-server" containerID="cri-o://68b3afdf8dbcd2d150e269a2bd753c535e9298e166bc4f8e6e78b3425eb0320f" gracePeriod=2 Oct 04 02:53:41 crc kubenswrapper[4964]: I1004 02:53:41.948047 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.124844 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-catalog-content\") pod \"96c946dd-9fb6-4b72-8df1-cf5ae54e856b\" (UID: \"96c946dd-9fb6-4b72-8df1-cf5ae54e856b\") " Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.124970 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-utilities\") pod \"96c946dd-9fb6-4b72-8df1-cf5ae54e856b\" (UID: \"96c946dd-9fb6-4b72-8df1-cf5ae54e856b\") " Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.125299 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h7lc\" (UniqueName: \"kubernetes.io/projected/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-kube-api-access-6h7lc\") pod \"96c946dd-9fb6-4b72-8df1-cf5ae54e856b\" (UID: \"96c946dd-9fb6-4b72-8df1-cf5ae54e856b\") " Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.126373 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-utilities" (OuterVolumeSpecName: "utilities") pod "96c946dd-9fb6-4b72-8df1-cf5ae54e856b" (UID: "96c946dd-9fb6-4b72-8df1-cf5ae54e856b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.127430 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.135818 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-kube-api-access-6h7lc" (OuterVolumeSpecName: "kube-api-access-6h7lc") pod "96c946dd-9fb6-4b72-8df1-cf5ae54e856b" (UID: "96c946dd-9fb6-4b72-8df1-cf5ae54e856b"). InnerVolumeSpecName "kube-api-access-6h7lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.149877 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96c946dd-9fb6-4b72-8df1-cf5ae54e856b" (UID: "96c946dd-9fb6-4b72-8df1-cf5ae54e856b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.229369 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.229414 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h7lc\" (UniqueName: \"kubernetes.io/projected/96c946dd-9fb6-4b72-8df1-cf5ae54e856b-kube-api-access-6h7lc\") on node \"crc\" DevicePath \"\"" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.569539 4964 generic.go:334] "Generic (PLEG): container finished" podID="96c946dd-9fb6-4b72-8df1-cf5ae54e856b" containerID="68b3afdf8dbcd2d150e269a2bd753c535e9298e166bc4f8e6e78b3425eb0320f" exitCode=0 Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.569668 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpdhg" event={"ID":"96c946dd-9fb6-4b72-8df1-cf5ae54e856b","Type":"ContainerDied","Data":"68b3afdf8dbcd2d150e269a2bd753c535e9298e166bc4f8e6e78b3425eb0320f"} Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.569756 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cpdhg" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.569835 4964 scope.go:117] "RemoveContainer" containerID="68b3afdf8dbcd2d150e269a2bd753c535e9298e166bc4f8e6e78b3425eb0320f" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.569801 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpdhg" event={"ID":"96c946dd-9fb6-4b72-8df1-cf5ae54e856b","Type":"ContainerDied","Data":"40de5f1698855c446c89cf7b4b20b648806affe3c207060a1e67732384306d97"} Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.598670 4964 scope.go:117] "RemoveContainer" containerID="f41c84133cf10b92a96943a1f94e5bf9807a7cc90f0d8f9e0c94f9539219f8ec" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.622512 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cpdhg"] Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.629200 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cpdhg"] Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.645713 4964 scope.go:117] "RemoveContainer" containerID="60f4b675d6bd603a752d80e0c87195fe12d6b339f0a132750f425a31ea3d2ae8" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.667016 4964 scope.go:117] "RemoveContainer" containerID="68b3afdf8dbcd2d150e269a2bd753c535e9298e166bc4f8e6e78b3425eb0320f" Oct 04 02:53:42 crc kubenswrapper[4964]: E1004 02:53:42.667581 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68b3afdf8dbcd2d150e269a2bd753c535e9298e166bc4f8e6e78b3425eb0320f\": container with ID starting with 68b3afdf8dbcd2d150e269a2bd753c535e9298e166bc4f8e6e78b3425eb0320f not found: ID does not exist" containerID="68b3afdf8dbcd2d150e269a2bd753c535e9298e166bc4f8e6e78b3425eb0320f" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.667753 4964 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68b3afdf8dbcd2d150e269a2bd753c535e9298e166bc4f8e6e78b3425eb0320f"} err="failed to get container status \"68b3afdf8dbcd2d150e269a2bd753c535e9298e166bc4f8e6e78b3425eb0320f\": rpc error: code = NotFound desc = could not find container \"68b3afdf8dbcd2d150e269a2bd753c535e9298e166bc4f8e6e78b3425eb0320f\": container with ID starting with 68b3afdf8dbcd2d150e269a2bd753c535e9298e166bc4f8e6e78b3425eb0320f not found: ID does not exist" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.667889 4964 scope.go:117] "RemoveContainer" containerID="f41c84133cf10b92a96943a1f94e5bf9807a7cc90f0d8f9e0c94f9539219f8ec" Oct 04 02:53:42 crc kubenswrapper[4964]: E1004 02:53:42.668871 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f41c84133cf10b92a96943a1f94e5bf9807a7cc90f0d8f9e0c94f9539219f8ec\": container with ID starting with f41c84133cf10b92a96943a1f94e5bf9807a7cc90f0d8f9e0c94f9539219f8ec not found: ID does not exist" containerID="f41c84133cf10b92a96943a1f94e5bf9807a7cc90f0d8f9e0c94f9539219f8ec" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.668964 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41c84133cf10b92a96943a1f94e5bf9807a7cc90f0d8f9e0c94f9539219f8ec"} err="failed to get container status \"f41c84133cf10b92a96943a1f94e5bf9807a7cc90f0d8f9e0c94f9539219f8ec\": rpc error: code = NotFound desc = could not find container \"f41c84133cf10b92a96943a1f94e5bf9807a7cc90f0d8f9e0c94f9539219f8ec\": container with ID starting with f41c84133cf10b92a96943a1f94e5bf9807a7cc90f0d8f9e0c94f9539219f8ec not found: ID does not exist" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.669012 4964 scope.go:117] "RemoveContainer" containerID="60f4b675d6bd603a752d80e0c87195fe12d6b339f0a132750f425a31ea3d2ae8" Oct 04 02:53:42 crc kubenswrapper[4964]: E1004 
02:53:42.669433 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60f4b675d6bd603a752d80e0c87195fe12d6b339f0a132750f425a31ea3d2ae8\": container with ID starting with 60f4b675d6bd603a752d80e0c87195fe12d6b339f0a132750f425a31ea3d2ae8 not found: ID does not exist" containerID="60f4b675d6bd603a752d80e0c87195fe12d6b339f0a132750f425a31ea3d2ae8" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.669567 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f4b675d6bd603a752d80e0c87195fe12d6b339f0a132750f425a31ea3d2ae8"} err="failed to get container status \"60f4b675d6bd603a752d80e0c87195fe12d6b339f0a132750f425a31ea3d2ae8\": rpc error: code = NotFound desc = could not find container \"60f4b675d6bd603a752d80e0c87195fe12d6b339f0a132750f425a31ea3d2ae8\": container with ID starting with 60f4b675d6bd603a752d80e0c87195fe12d6b339f0a132750f425a31ea3d2ae8 not found: ID does not exist" Oct 04 02:53:42 crc kubenswrapper[4964]: I1004 02:53:42.860432 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96c946dd-9fb6-4b72-8df1-cf5ae54e856b" path="/var/lib/kubelet/pods/96c946dd-9fb6-4b72-8df1-cf5ae54e856b/volumes" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.214022 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4nhk4"] Oct 04 02:53:50 crc kubenswrapper[4964]: E1004 02:53:50.215462 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c946dd-9fb6-4b72-8df1-cf5ae54e856b" containerName="extract-utilities" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.215508 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c946dd-9fb6-4b72-8df1-cf5ae54e856b" containerName="extract-utilities" Oct 04 02:53:50 crc kubenswrapper[4964]: E1004 02:53:50.215537 4964 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="96c946dd-9fb6-4b72-8df1-cf5ae54e856b" containerName="registry-server" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.215551 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c946dd-9fb6-4b72-8df1-cf5ae54e856b" containerName="registry-server" Oct 04 02:53:50 crc kubenswrapper[4964]: E1004 02:53:50.215580 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c946dd-9fb6-4b72-8df1-cf5ae54e856b" containerName="extract-content" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.215591 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c946dd-9fb6-4b72-8df1-cf5ae54e856b" containerName="extract-content" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.215859 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="96c946dd-9fb6-4b72-8df1-cf5ae54e856b" containerName="registry-server" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.217357 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.228657 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4nhk4"] Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.268869 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5jnf\" (UniqueName: \"kubernetes.io/projected/f0ac9809-0e8c-4579-813a-9acacca251d5-kube-api-access-c5jnf\") pod \"certified-operators-4nhk4\" (UID: \"f0ac9809-0e8c-4579-813a-9acacca251d5\") " pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.268931 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0ac9809-0e8c-4579-813a-9acacca251d5-catalog-content\") pod \"certified-operators-4nhk4\" (UID: 
\"f0ac9809-0e8c-4579-813a-9acacca251d5\") " pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.268991 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0ac9809-0e8c-4579-813a-9acacca251d5-utilities\") pod \"certified-operators-4nhk4\" (UID: \"f0ac9809-0e8c-4579-813a-9acacca251d5\") " pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.370604 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5jnf\" (UniqueName: \"kubernetes.io/projected/f0ac9809-0e8c-4579-813a-9acacca251d5-kube-api-access-c5jnf\") pod \"certified-operators-4nhk4\" (UID: \"f0ac9809-0e8c-4579-813a-9acacca251d5\") " pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.370788 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0ac9809-0e8c-4579-813a-9acacca251d5-catalog-content\") pod \"certified-operators-4nhk4\" (UID: \"f0ac9809-0e8c-4579-813a-9acacca251d5\") " pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.370845 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0ac9809-0e8c-4579-813a-9acacca251d5-utilities\") pod \"certified-operators-4nhk4\" (UID: \"f0ac9809-0e8c-4579-813a-9acacca251d5\") " pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.371714 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0ac9809-0e8c-4579-813a-9acacca251d5-catalog-content\") pod \"certified-operators-4nhk4\" (UID: 
\"f0ac9809-0e8c-4579-813a-9acacca251d5\") " pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.371750 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0ac9809-0e8c-4579-813a-9acacca251d5-utilities\") pod \"certified-operators-4nhk4\" (UID: \"f0ac9809-0e8c-4579-813a-9acacca251d5\") " pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.402335 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5jnf\" (UniqueName: \"kubernetes.io/projected/f0ac9809-0e8c-4579-813a-9acacca251d5-kube-api-access-c5jnf\") pod \"certified-operators-4nhk4\" (UID: \"f0ac9809-0e8c-4579-813a-9acacca251d5\") " pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:53:50 crc kubenswrapper[4964]: I1004 02:53:50.547166 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:53:51 crc kubenswrapper[4964]: I1004 02:53:51.019237 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4nhk4"] Oct 04 02:53:51 crc kubenswrapper[4964]: I1004 02:53:51.654069 4964 generic.go:334] "Generic (PLEG): container finished" podID="f0ac9809-0e8c-4579-813a-9acacca251d5" containerID="d7cf1820d5654d8daabf3e2a9ba455b663b94275811cc12170f6c0989ae0d3e6" exitCode=0 Oct 04 02:53:51 crc kubenswrapper[4964]: I1004 02:53:51.654128 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nhk4" event={"ID":"f0ac9809-0e8c-4579-813a-9acacca251d5","Type":"ContainerDied","Data":"d7cf1820d5654d8daabf3e2a9ba455b663b94275811cc12170f6c0989ae0d3e6"} Oct 04 02:53:51 crc kubenswrapper[4964]: I1004 02:53:51.654485 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nhk4" 
event={"ID":"f0ac9809-0e8c-4579-813a-9acacca251d5","Type":"ContainerStarted","Data":"29d2a4d578a4ccd5dceacdeaf9873b326a1a40e7da0c8a7377d6d4b531f84767"} Oct 04 02:53:52 crc kubenswrapper[4964]: I1004 02:53:52.663351 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nhk4" event={"ID":"f0ac9809-0e8c-4579-813a-9acacca251d5","Type":"ContainerStarted","Data":"ded72e26705cda3cc1edd32107058cbbd402456893b7bdb1287ddb5c072285e8"} Oct 04 02:53:53 crc kubenswrapper[4964]: I1004 02:53:53.674637 4964 generic.go:334] "Generic (PLEG): container finished" podID="f0ac9809-0e8c-4579-813a-9acacca251d5" containerID="ded72e26705cda3cc1edd32107058cbbd402456893b7bdb1287ddb5c072285e8" exitCode=0 Oct 04 02:53:53 crc kubenswrapper[4964]: I1004 02:53:53.674710 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nhk4" event={"ID":"f0ac9809-0e8c-4579-813a-9acacca251d5","Type":"ContainerDied","Data":"ded72e26705cda3cc1edd32107058cbbd402456893b7bdb1287ddb5c072285e8"} Oct 04 02:53:56 crc kubenswrapper[4964]: I1004 02:53:56.697182 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nhk4" event={"ID":"f0ac9809-0e8c-4579-813a-9acacca251d5","Type":"ContainerStarted","Data":"5fbfdae6eb44332f45b6c7c968787c985530813b2c9634fabafe3099915d8648"} Oct 04 02:53:56 crc kubenswrapper[4964]: I1004 02:53:56.723653 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4nhk4" podStartSLOduration=2.244433933 podStartE2EDuration="6.723606671s" podCreationTimestamp="2025-10-04 02:53:50 +0000 UTC" firstStartedPulling="2025-10-04 02:53:51.656856653 +0000 UTC m=+811.553815321" lastFinishedPulling="2025-10-04 02:53:56.136029381 +0000 UTC m=+816.032988059" observedRunningTime="2025-10-04 02:53:56.721352682 +0000 UTC m=+816.618311330" watchObservedRunningTime="2025-10-04 02:53:56.723606671 +0000 UTC 
m=+816.620565349" Oct 04 02:54:00 crc kubenswrapper[4964]: I1004 02:54:00.548055 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:54:00 crc kubenswrapper[4964]: I1004 02:54:00.548337 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:54:00 crc kubenswrapper[4964]: I1004 02:54:00.633355 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:54:01 crc kubenswrapper[4964]: I1004 02:54:01.237312 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zwknv"] Oct 04 02:54:01 crc kubenswrapper[4964]: I1004 02:54:01.238406 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:01 crc kubenswrapper[4964]: I1004 02:54:01.263832 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwknv"] Oct 04 02:54:01 crc kubenswrapper[4964]: I1004 02:54:01.395739 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgxqq\" (UniqueName: \"kubernetes.io/projected/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-kube-api-access-mgxqq\") pod \"community-operators-zwknv\" (UID: \"72ebe95e-5523-4fa6-8a2a-c2c553f315b1\") " pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:01 crc kubenswrapper[4964]: I1004 02:54:01.395819 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-catalog-content\") pod \"community-operators-zwknv\" (UID: \"72ebe95e-5523-4fa6-8a2a-c2c553f315b1\") " pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:01 crc 
kubenswrapper[4964]: I1004 02:54:01.395840 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-utilities\") pod \"community-operators-zwknv\" (UID: \"72ebe95e-5523-4fa6-8a2a-c2c553f315b1\") " pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:01 crc kubenswrapper[4964]: I1004 02:54:01.496817 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-catalog-content\") pod \"community-operators-zwknv\" (UID: \"72ebe95e-5523-4fa6-8a2a-c2c553f315b1\") " pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:01 crc kubenswrapper[4964]: I1004 02:54:01.496863 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-utilities\") pod \"community-operators-zwknv\" (UID: \"72ebe95e-5523-4fa6-8a2a-c2c553f315b1\") " pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:01 crc kubenswrapper[4964]: I1004 02:54:01.496941 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgxqq\" (UniqueName: \"kubernetes.io/projected/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-kube-api-access-mgxqq\") pod \"community-operators-zwknv\" (UID: \"72ebe95e-5523-4fa6-8a2a-c2c553f315b1\") " pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:01 crc kubenswrapper[4964]: I1004 02:54:01.497584 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-catalog-content\") pod \"community-operators-zwknv\" (UID: \"72ebe95e-5523-4fa6-8a2a-c2c553f315b1\") " pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:01 crc 
kubenswrapper[4964]: I1004 02:54:01.497608 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-utilities\") pod \"community-operators-zwknv\" (UID: \"72ebe95e-5523-4fa6-8a2a-c2c553f315b1\") " pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:01 crc kubenswrapper[4964]: I1004 02:54:01.527233 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgxqq\" (UniqueName: \"kubernetes.io/projected/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-kube-api-access-mgxqq\") pod \"community-operators-zwknv\" (UID: \"72ebe95e-5523-4fa6-8a2a-c2c553f315b1\") " pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:01 crc kubenswrapper[4964]: I1004 02:54:01.562700 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:02 crc kubenswrapper[4964]: I1004 02:54:02.038566 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwknv"] Oct 04 02:54:02 crc kubenswrapper[4964]: I1004 02:54:02.750612 4964 generic.go:334] "Generic (PLEG): container finished" podID="72ebe95e-5523-4fa6-8a2a-c2c553f315b1" containerID="acd1fb7361d733b7aa02046d2b75696f89842e04659a38d8fbf1adf2d4563855" exitCode=0 Oct 04 02:54:02 crc kubenswrapper[4964]: I1004 02:54:02.750702 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwknv" event={"ID":"72ebe95e-5523-4fa6-8a2a-c2c553f315b1","Type":"ContainerDied","Data":"acd1fb7361d733b7aa02046d2b75696f89842e04659a38d8fbf1adf2d4563855"} Oct 04 02:54:02 crc kubenswrapper[4964]: I1004 02:54:02.750863 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwknv" 
event={"ID":"72ebe95e-5523-4fa6-8a2a-c2c553f315b1","Type":"ContainerStarted","Data":"e1b710a04b9a802b23ecd70842cf1e0e81f3a3415d7a9ace80774089375a2723"} Oct 04 02:54:03 crc kubenswrapper[4964]: I1004 02:54:03.759585 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwknv" event={"ID":"72ebe95e-5523-4fa6-8a2a-c2c553f315b1","Type":"ContainerStarted","Data":"e63c1ee3e8f285ccadede1dc095151d3e035347bfef096b438a385574d8542ad"} Oct 04 02:54:04 crc kubenswrapper[4964]: I1004 02:54:04.770166 4964 generic.go:334] "Generic (PLEG): container finished" podID="72ebe95e-5523-4fa6-8a2a-c2c553f315b1" containerID="e63c1ee3e8f285ccadede1dc095151d3e035347bfef096b438a385574d8542ad" exitCode=0 Oct 04 02:54:04 crc kubenswrapper[4964]: I1004 02:54:04.770209 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwknv" event={"ID":"72ebe95e-5523-4fa6-8a2a-c2c553f315b1","Type":"ContainerDied","Data":"e63c1ee3e8f285ccadede1dc095151d3e035347bfef096b438a385574d8542ad"} Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.374865 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-kcwbj"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.376277 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-kcwbj" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.379786 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-847kz" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.389359 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-kcwbj"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.394205 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55cd88dfc-q48nc"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.395267 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55cd88dfc-q48nc" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.399130 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-lthrl" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.404447 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-q7zh9"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.405355 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-q7zh9" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.408698 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-twznf" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.417734 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55cd88dfc-q48nc"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.422754 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-mbdfq"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.423955 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-mbdfq" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.427952 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-xvrf5" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.435719 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-q7zh9"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.440438 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-mbdfq"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.458735 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsf4f\" (UniqueName: \"kubernetes.io/projected/a9d92a9d-3e4a-4945-b449-4aed29708295-kube-api-access-qsf4f\") pod \"barbican-operator-controller-manager-5f7c849b98-kcwbj\" (UID: \"a9d92a9d-3e4a-4945-b449-4aed29708295\") " 
pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-kcwbj" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.458809 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdxfg\" (UniqueName: \"kubernetes.io/projected/35f77ec9-6ace-4f9c-ad47-6956e222902b-kube-api-access-fdxfg\") pod \"cinder-operator-controller-manager-55cd88dfc-q48nc\" (UID: \"35f77ec9-6ace-4f9c-ad47-6956e222902b\") " pod="openstack-operators/cinder-operator-controller-manager-55cd88dfc-q48nc" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.467995 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-qpmdv"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.468950 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-qpmdv" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.471988 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-h2gm2" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.479202 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-qpmdv"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.496155 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-8f8rc"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.497026 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-8f8rc" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.500770 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-7x98c" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.526755 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.528323 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.533310 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-rfgmd" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.533432 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.537776 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-99sf8"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.538721 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-99sf8" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.539974 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-tj6cg" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.552951 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-8f8rc"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.558269 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-655d88ccb9-frmcj"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.559412 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-frmcj" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.561770 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-pq9ww" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.564097 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsqrp\" (UniqueName: \"kubernetes.io/projected/71f63b59-61c1-43ae-8726-bdc38806ee71-kube-api-access-vsqrp\") pod \"glance-operator-controller-manager-5568b5d68-mbdfq\" (UID: \"71f63b59-61c1-43ae-8726-bdc38806ee71\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-mbdfq" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.564252 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgj6t\" (UniqueName: \"kubernetes.io/projected/7a706485-5c1c-4f12-854b-779a385023fe-kube-api-access-zgj6t\") pod \"horizon-operator-controller-manager-54876c876f-8f8rc\" (UID: 
\"7a706485-5c1c-4f12-854b-779a385023fe\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-8f8rc" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.564324 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdxfg\" (UniqueName: \"kubernetes.io/projected/35f77ec9-6ace-4f9c-ad47-6956e222902b-kube-api-access-fdxfg\") pod \"cinder-operator-controller-manager-55cd88dfc-q48nc\" (UID: \"35f77ec9-6ace-4f9c-ad47-6956e222902b\") " pod="openstack-operators/cinder-operator-controller-manager-55cd88dfc-q48nc" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.564407 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnmgx\" (UniqueName: \"kubernetes.io/projected/082a9114-18e1-40d0-829e-f2758614e49b-kube-api-access-dnmgx\") pod \"designate-operator-controller-manager-75dfd9b554-q7zh9\" (UID: \"082a9114-18e1-40d0-829e-f2758614e49b\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-q7zh9" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.564439 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9drjz\" (UniqueName: \"kubernetes.io/projected/5d80f2c1-5570-48f8-908f-d580f7c7ecc7-kube-api-access-9drjz\") pod \"heat-operator-controller-manager-8f58bc9db-qpmdv\" (UID: \"5d80f2c1-5570-48f8-908f-d580f7c7ecc7\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-qpmdv" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.564600 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsf4f\" (UniqueName: \"kubernetes.io/projected/a9d92a9d-3e4a-4945-b449-4aed29708295-kube-api-access-qsf4f\") pod \"barbican-operator-controller-manager-5f7c849b98-kcwbj\" (UID: \"a9d92a9d-3e4a-4945-b449-4aed29708295\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-kcwbj" 
Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.615275 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.625257 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsf4f\" (UniqueName: \"kubernetes.io/projected/a9d92a9d-3e4a-4945-b449-4aed29708295-kube-api-access-qsf4f\") pod \"barbican-operator-controller-manager-5f7c849b98-kcwbj\" (UID: \"a9d92a9d-3e4a-4945-b449-4aed29708295\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-kcwbj" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.627104 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdxfg\" (UniqueName: \"kubernetes.io/projected/35f77ec9-6ace-4f9c-ad47-6956e222902b-kube-api-access-fdxfg\") pod \"cinder-operator-controller-manager-55cd88dfc-q48nc\" (UID: \"35f77ec9-6ace-4f9c-ad47-6956e222902b\") " pod="openstack-operators/cinder-operator-controller-manager-55cd88dfc-q48nc" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.629580 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-655d88ccb9-frmcj"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.659356 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-99sf8"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.671365 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgj6t\" (UniqueName: \"kubernetes.io/projected/7a706485-5c1c-4f12-854b-779a385023fe-kube-api-access-zgj6t\") pod \"horizon-operator-controller-manager-54876c876f-8f8rc\" (UID: \"7a706485-5c1c-4f12-854b-779a385023fe\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-8f8rc" Oct 04 02:54:05 crc kubenswrapper[4964]: 
I1004 02:54:05.671426 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56jlp\" (UniqueName: \"kubernetes.io/projected/1ac8bb69-05a0-4faa-a294-5243e4a2e21a-kube-api-access-56jlp\") pod \"keystone-operator-controller-manager-655d88ccb9-frmcj\" (UID: \"1ac8bb69-05a0-4faa-a294-5243e4a2e21a\") " pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-frmcj" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.671454 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnmgx\" (UniqueName: \"kubernetes.io/projected/082a9114-18e1-40d0-829e-f2758614e49b-kube-api-access-dnmgx\") pod \"designate-operator-controller-manager-75dfd9b554-q7zh9\" (UID: \"082a9114-18e1-40d0-829e-f2758614e49b\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-q7zh9" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.671478 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9drjz\" (UniqueName: \"kubernetes.io/projected/5d80f2c1-5570-48f8-908f-d580f7c7ecc7-kube-api-access-9drjz\") pod \"heat-operator-controller-manager-8f58bc9db-qpmdv\" (UID: \"5d80f2c1-5570-48f8-908f-d580f7c7ecc7\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-qpmdv" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.671514 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/892610de-e4c4-4b99-a0ca-07fc0ad63df2-cert\") pod \"infra-operator-controller-manager-658588b8c9-n5sks\" (UID: \"892610de-e4c4-4b99-a0ca-07fc0ad63df2\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.671548 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsqrp\" (UniqueName: 
\"kubernetes.io/projected/71f63b59-61c1-43ae-8726-bdc38806ee71-kube-api-access-vsqrp\") pod \"glance-operator-controller-manager-5568b5d68-mbdfq\" (UID: \"71f63b59-61c1-43ae-8726-bdc38806ee71\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-mbdfq" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.671567 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf4l9\" (UniqueName: \"kubernetes.io/projected/7272252c-6b3a-4680-9f59-37bc87154be8-kube-api-access-mf4l9\") pod \"ironic-operator-controller-manager-699b87f775-99sf8\" (UID: \"7272252c-6b3a-4680-9f59-37bc87154be8\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-99sf8" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.671589 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk4pn\" (UniqueName: \"kubernetes.io/projected/892610de-e4c4-4b99-a0ca-07fc0ad63df2-kube-api-access-dk4pn\") pod \"infra-operator-controller-manager-658588b8c9-n5sks\" (UID: \"892610de-e4c4-4b99-a0ca-07fc0ad63df2\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.691550 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgj6t\" (UniqueName: \"kubernetes.io/projected/7a706485-5c1c-4f12-854b-779a385023fe-kube-api-access-zgj6t\") pod \"horizon-operator-controller-manager-54876c876f-8f8rc\" (UID: \"7a706485-5c1c-4f12-854b-779a385023fe\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-8f8rc" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.693656 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.694675 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.695448 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-kcwbj" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.700389 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2ppqq" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.700591 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.700711 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.701810 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.703078 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9drjz\" (UniqueName: \"kubernetes.io/projected/5d80f2c1-5570-48f8-908f-d580f7c7ecc7-kube-api-access-9drjz\") pod \"heat-operator-controller-manager-8f58bc9db-qpmdv\" (UID: \"5d80f2c1-5570-48f8-908f-d580f7c7ecc7\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-qpmdv" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.708481 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55cd88dfc-q48nc" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.709322 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-6bbmh" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.711300 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnmgx\" (UniqueName: \"kubernetes.io/projected/082a9114-18e1-40d0-829e-f2758614e49b-kube-api-access-dnmgx\") pod \"designate-operator-controller-manager-75dfd9b554-q7zh9\" (UID: \"082a9114-18e1-40d0-829e-f2758614e49b\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-q7zh9" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.712676 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsqrp\" (UniqueName: \"kubernetes.io/projected/71f63b59-61c1-43ae-8726-bdc38806ee71-kube-api-access-vsqrp\") pod \"glance-operator-controller-manager-5568b5d68-mbdfq\" (UID: \"71f63b59-61c1-43ae-8726-bdc38806ee71\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-mbdfq" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.720966 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-85ckj"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.722132 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-85ckj" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.724740 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-q7zh9" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.725191 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.726870 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.727018 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-jffvh" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.731098 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-85ckj"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.732092 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-l2fbz" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.739310 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-mbdfq" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.747392 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.759851 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.765883 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-strsv"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.767018 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-strsv" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.769883 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-kmdsn" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.772540 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf4l9\" (UniqueName: \"kubernetes.io/projected/7272252c-6b3a-4680-9f59-37bc87154be8-kube-api-access-mf4l9\") pod \"ironic-operator-controller-manager-699b87f775-99sf8\" (UID: \"7272252c-6b3a-4680-9f59-37bc87154be8\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-99sf8" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.772602 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk4pn\" (UniqueName: \"kubernetes.io/projected/892610de-e4c4-4b99-a0ca-07fc0ad63df2-kube-api-access-dk4pn\") pod \"infra-operator-controller-manager-658588b8c9-n5sks\" (UID: \"892610de-e4c4-4b99-a0ca-07fc0ad63df2\") " 
pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.772685 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56jlp\" (UniqueName: \"kubernetes.io/projected/1ac8bb69-05a0-4faa-a294-5243e4a2e21a-kube-api-access-56jlp\") pod \"keystone-operator-controller-manager-655d88ccb9-frmcj\" (UID: \"1ac8bb69-05a0-4faa-a294-5243e4a2e21a\") " pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-frmcj" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.772735 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7skmj\" (UniqueName: \"kubernetes.io/projected/aab9e95e-6af6-483a-9cef-96a4accd24f9-kube-api-access-7skmj\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh\" (UID: \"aab9e95e-6af6-483a-9cef-96a4accd24f9\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.772775 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/892610de-e4c4-4b99-a0ca-07fc0ad63df2-cert\") pod \"infra-operator-controller-manager-658588b8c9-n5sks\" (UID: \"892610de-e4c4-4b99-a0ca-07fc0ad63df2\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.772799 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k79nn\" (UniqueName: \"kubernetes.io/projected/be3f98e4-03d2-46bb-b7fe-bc050255934c-kube-api-access-k79nn\") pod \"manila-operator-controller-manager-65d89cfd9f-sz72v\" (UID: \"be3f98e4-03d2-46bb-b7fe-bc050255934c\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v" Oct 04 02:54:05 crc kubenswrapper[4964]: E1004 02:54:05.773147 4964 
secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 04 02:54:05 crc kubenswrapper[4964]: E1004 02:54:05.773188 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/892610de-e4c4-4b99-a0ca-07fc0ad63df2-cert podName:892610de-e4c4-4b99-a0ca-07fc0ad63df2 nodeName:}" failed. No retries permitted until 2025-10-04 02:54:06.273171797 +0000 UTC m=+826.170130435 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/892610de-e4c4-4b99-a0ca-07fc0ad63df2-cert") pod "infra-operator-controller-manager-658588b8c9-n5sks" (UID: "892610de-e4c4-4b99-a0ca-07fc0ad63df2") : secret "infra-operator-webhook-server-cert" not found Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.777451 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-strsv"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.794267 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56jlp\" (UniqueName: \"kubernetes.io/projected/1ac8bb69-05a0-4faa-a294-5243e4a2e21a-kube-api-access-56jlp\") pod \"keystone-operator-controller-manager-655d88ccb9-frmcj\" (UID: \"1ac8bb69-05a0-4faa-a294-5243e4a2e21a\") " pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-frmcj" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.796207 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf4l9\" (UniqueName: \"kubernetes.io/projected/7272252c-6b3a-4680-9f59-37bc87154be8-kube-api-access-mf4l9\") pod \"ironic-operator-controller-manager-699b87f775-99sf8\" (UID: \"7272252c-6b3a-4680-9f59-37bc87154be8\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-99sf8" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.800082 4964 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-qpmdv" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.808815 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.809840 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.812375 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-87tgf" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.815255 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-8f8rc" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.817747 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-wlzgd"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.818916 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-wlzgd" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.820030 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-45zmm" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.822991 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.829669 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.830677 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.831409 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.831727 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.834378 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-wlzgd"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.836260 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-qfhvd" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.836444 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.836616 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-whpd5" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.839387 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk4pn\" (UniqueName: \"kubernetes.io/projected/892610de-e4c4-4b99-a0ca-07fc0ad63df2-kube-api-access-dk4pn\") pod \"infra-operator-controller-manager-658588b8c9-n5sks\" (UID: \"892610de-e4c4-4b99-a0ca-07fc0ad63df2\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.841407 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwknv" event={"ID":"72ebe95e-5523-4fa6-8a2a-c2c553f315b1","Type":"ContainerStarted","Data":"06bcb6f4f12ef71723138814d4c15774e1ffe64b1c8461b644c26b78fc22fb15"} Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.849536 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.865508 4964 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-99sf8" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.876016 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rspmn\" (UniqueName: \"kubernetes.io/projected/e756e118-8e7e-4e1f-827d-cef4acdbb848-kube-api-access-rspmn\") pod \"nova-operator-controller-manager-7c7fc454ff-nkxrt\" (UID: \"e756e118-8e7e-4e1f-827d-cef4acdbb848\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.876080 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgdbx\" (UniqueName: \"kubernetes.io/projected/b4751629-75d2-4c2a-afb5-a7b7915cb644-kube-api-access-hgdbx\") pod \"ovn-operator-controller-manager-579449c7d5-qgrnf\" (UID: \"b4751629-75d2-4c2a-afb5-a7b7915cb644\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.876115 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw6z2\" (UniqueName: \"kubernetes.io/projected/4c3ecd12-c2c1-48ba-b75d-e1df6fcb7a4a-kube-api-access-gw6z2\") pod \"neutron-operator-controller-manager-8d984cc4d-85ckj\" (UID: \"4c3ecd12-c2c1-48ba-b75d-e1df6fcb7a4a\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-85ckj" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.876143 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7skmj\" (UniqueName: \"kubernetes.io/projected/aab9e95e-6af6-483a-9cef-96a4accd24f9-kube-api-access-7skmj\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh\" (UID: \"aab9e95e-6af6-483a-9cef-96a4accd24f9\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh" 
Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.876163 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d946p\" (UniqueName: \"kubernetes.io/projected/2611c21b-338e-4dc0-b977-e15067937730-kube-api-access-d946p\") pod \"octavia-operator-controller-manager-7468f855d8-strsv\" (UID: \"2611c21b-338e-4dc0-b977-e15067937730\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-strsv" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.876201 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k79nn\" (UniqueName: \"kubernetes.io/projected/be3f98e4-03d2-46bb-b7fe-bc050255934c-kube-api-access-k79nn\") pod \"manila-operator-controller-manager-65d89cfd9f-sz72v\" (UID: \"be3f98e4-03d2-46bb-b7fe-bc050255934c\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.878195 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.891974 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-frmcj" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.900326 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7skmj\" (UniqueName: \"kubernetes.io/projected/aab9e95e-6af6-483a-9cef-96a4accd24f9-kube-api-access-7skmj\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh\" (UID: \"aab9e95e-6af6-483a-9cef-96a4accd24f9\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.906271 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k79nn\" (UniqueName: \"kubernetes.io/projected/be3f98e4-03d2-46bb-b7fe-bc050255934c-kube-api-access-k79nn\") pod \"manila-operator-controller-manager-65d89cfd9f-sz72v\" (UID: \"be3f98e4-03d2-46bb-b7fe-bc050255934c\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.921711 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hkmlt"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.922887 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hkmlt" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.925466 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hms92" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.929894 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hkmlt"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.964179 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.965793 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.968476 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-nxbz6" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.977604 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgdbx\" (UniqueName: \"kubernetes.io/projected/b4751629-75d2-4c2a-afb5-a7b7915cb644-kube-api-access-hgdbx\") pod \"ovn-operator-controller-manager-579449c7d5-qgrnf\" (UID: \"b4751629-75d2-4c2a-afb5-a7b7915cb644\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.977675 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct6t8\" (UniqueName: \"kubernetes.io/projected/3fa08171-beb2-42d4-a751-fe46eb179a70-kube-api-access-ct6t8\") pod \"placement-operator-controller-manager-54689d9f88-wlzgd\" (UID: \"3fa08171-beb2-42d4-a751-fe46eb179a70\") " 
pod="openstack-operators/placement-operator-controller-manager-54689d9f88-wlzgd" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.977703 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw6z2\" (UniqueName: \"kubernetes.io/projected/4c3ecd12-c2c1-48ba-b75d-e1df6fcb7a4a-kube-api-access-gw6z2\") pod \"neutron-operator-controller-manager-8d984cc4d-85ckj\" (UID: \"4c3ecd12-c2c1-48ba-b75d-e1df6fcb7a4a\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-85ckj" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.977729 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d946p\" (UniqueName: \"kubernetes.io/projected/2611c21b-338e-4dc0-b977-e15067937730-kube-api-access-d946p\") pod \"octavia-operator-controller-manager-7468f855d8-strsv\" (UID: \"2611c21b-338e-4dc0-b977-e15067937730\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-strsv" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.977782 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/762eb95d-bd98-4d86-8fc8-404234c0a13e-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm\" (UID: \"762eb95d-bd98-4d86-8fc8-404234c0a13e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.977819 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns9j5\" (UniqueName: \"kubernetes.io/projected/c1244d8a-d20c-4318-9dfd-3617e35e54e9-kube-api-access-ns9j5\") pod \"swift-operator-controller-manager-6859f9b676-dzqh8\" (UID: \"c1244d8a-d20c-4318-9dfd-3617e35e54e9\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 
02:54:05.977854 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9ffd\" (UniqueName: \"kubernetes.io/projected/762eb95d-bd98-4d86-8fc8-404234c0a13e-kube-api-access-l9ffd\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm\" (UID: \"762eb95d-bd98-4d86-8fc8-404234c0a13e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.977873 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rspmn\" (UniqueName: \"kubernetes.io/projected/e756e118-8e7e-4e1f-827d-cef4acdbb848-kube-api-access-rspmn\") pod \"nova-operator-controller-manager-7c7fc454ff-nkxrt\" (UID: \"e756e118-8e7e-4e1f-827d-cef4acdbb848\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt" Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.991248 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9"] Oct 04 02:54:05 crc kubenswrapper[4964]: I1004 02:54:05.999255 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d946p\" (UniqueName: \"kubernetes.io/projected/2611c21b-338e-4dc0-b977-e15067937730-kube-api-access-d946p\") pod \"octavia-operator-controller-manager-7468f855d8-strsv\" (UID: \"2611c21b-338e-4dc0-b977-e15067937730\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-strsv" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.054412 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.062015 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.068621 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zwknv" podStartSLOduration=2.628976157 podStartE2EDuration="5.068596895s" podCreationTimestamp="2025-10-04 02:54:01 +0000 UTC" firstStartedPulling="2025-10-04 02:54:02.752649134 +0000 UTC m=+822.649607812" lastFinishedPulling="2025-10-04 02:54:05.192269902 +0000 UTC m=+825.089228550" observedRunningTime="2025-10-04 02:54:05.916098206 +0000 UTC m=+825.813056834" watchObservedRunningTime="2025-10-04 02:54:06.068596895 +0000 UTC m=+825.965555533" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.078061 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgdbx\" (UniqueName: \"kubernetes.io/projected/b4751629-75d2-4c2a-afb5-a7b7915cb644-kube-api-access-hgdbx\") pod \"ovn-operator-controller-manager-579449c7d5-qgrnf\" (UID: \"b4751629-75d2-4c2a-afb5-a7b7915cb644\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.079737 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wpbq\" (UniqueName: \"kubernetes.io/projected/36a7b704-074e-4b3c-a459-e55607c9f604-kube-api-access-6wpbq\") pod \"test-operator-controller-manager-5cd5cb47d7-vrfm9\" (UID: \"36a7b704-074e-4b3c-a459-e55607c9f604\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.079798 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/762eb95d-bd98-4d86-8fc8-404234c0a13e-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm\" (UID: 
\"762eb95d-bd98-4d86-8fc8-404234c0a13e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.079848 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns9j5\" (UniqueName: \"kubernetes.io/projected/c1244d8a-d20c-4318-9dfd-3617e35e54e9-kube-api-access-ns9j5\") pod \"swift-operator-controller-manager-6859f9b676-dzqh8\" (UID: \"c1244d8a-d20c-4318-9dfd-3617e35e54e9\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.079984 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9ffd\" (UniqueName: \"kubernetes.io/projected/762eb95d-bd98-4d86-8fc8-404234c0a13e-kube-api-access-l9ffd\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm\" (UID: \"762eb95d-bd98-4d86-8fc8-404234c0a13e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.080416 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxp7c\" (UniqueName: \"kubernetes.io/projected/5ba7848e-5d4e-4de0-a0de-2a8bcd534c90-kube-api-access-pxp7c\") pod \"telemetry-operator-controller-manager-5d4d74dd89-hkmlt\" (UID: \"5ba7848e-5d4e-4de0-a0de-2a8bcd534c90\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hkmlt" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.080520 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct6t8\" (UniqueName: \"kubernetes.io/projected/3fa08171-beb2-42d4-a751-fe46eb179a70-kube-api-access-ct6t8\") pod \"placement-operator-controller-manager-54689d9f88-wlzgd\" (UID: \"3fa08171-beb2-42d4-a751-fe46eb179a70\") " 
pod="openstack-operators/placement-operator-controller-manager-54689d9f88-wlzgd" Oct 04 02:54:06 crc kubenswrapper[4964]: E1004 02:54:06.080543 4964 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 04 02:54:06 crc kubenswrapper[4964]: E1004 02:54:06.080620 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/762eb95d-bd98-4d86-8fc8-404234c0a13e-cert podName:762eb95d-bd98-4d86-8fc8-404234c0a13e nodeName:}" failed. No retries permitted until 2025-10-04 02:54:06.58059039 +0000 UTC m=+826.477549028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/762eb95d-bd98-4d86-8fc8-404234c0a13e-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" (UID: "762eb95d-bd98-4d86-8fc8-404234c0a13e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.085308 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw6z2\" (UniqueName: \"kubernetes.io/projected/4c3ecd12-c2c1-48ba-b75d-e1df6fcb7a4a-kube-api-access-gw6z2\") pod \"neutron-operator-controller-manager-8d984cc4d-85ckj\" (UID: \"4c3ecd12-c2c1-48ba-b75d-e1df6fcb7a4a\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-85ckj" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.105728 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rspmn\" (UniqueName: \"kubernetes.io/projected/e756e118-8e7e-4e1f-827d-cef4acdbb848-kube-api-access-rspmn\") pod \"nova-operator-controller-manager-7c7fc454ff-nkxrt\" (UID: \"e756e118-8e7e-4e1f-827d-cef4acdbb848\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.111643 4964 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp"] Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.120313 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp"] Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.120406 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.122695 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-kqrd6" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.129649 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf"] Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.130858 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.135189 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct6t8\" (UniqueName: \"kubernetes.io/projected/3fa08171-beb2-42d4-a751-fe46eb179a70-kube-api-access-ct6t8\") pod \"placement-operator-controller-manager-54689d9f88-wlzgd\" (UID: \"3fa08171-beb2-42d4-a751-fe46eb179a70\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-wlzgd" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.139797 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf"] Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.145212 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.145256 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-j2jw4" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.145656 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.146617 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns9j5\" (UniqueName: \"kubernetes.io/projected/c1244d8a-d20c-4318-9dfd-3617e35e54e9-kube-api-access-ns9j5\") pod \"swift-operator-controller-manager-6859f9b676-dzqh8\" (UID: \"c1244d8a-d20c-4318-9dfd-3617e35e54e9\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.161352 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9ffd\" (UniqueName: \"kubernetes.io/projected/762eb95d-bd98-4d86-8fc8-404234c0a13e-kube-api-access-l9ffd\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm\" (UID: \"762eb95d-bd98-4d86-8fc8-404234c0a13e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.168056 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-strsv" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.178945 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm"] Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.179829 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.181872 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-jm2cv" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.182623 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdm54\" (UniqueName: \"kubernetes.io/projected/e317dfbb-55c5-49ee-8e16-5bc0532e2dfb-kube-api-access-jdm54\") pod \"watcher-operator-controller-manager-6cbc6dd547-frmfp\" (UID: \"e317dfbb-55c5-49ee-8e16-5bc0532e2dfb\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.182694 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wpbq\" (UniqueName: \"kubernetes.io/projected/36a7b704-074e-4b3c-a459-e55607c9f604-kube-api-access-6wpbq\") pod \"test-operator-controller-manager-5cd5cb47d7-vrfm9\" (UID: \"36a7b704-074e-4b3c-a459-e55607c9f604\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.182738 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wprw9\" (UniqueName: \"kubernetes.io/projected/4274bd25-ba07-4036-80b8-86561c1a6f64-kube-api-access-wprw9\") pod \"openstack-operator-controller-manager-545dfb464d-g5kjf\" (UID: \"4274bd25-ba07-4036-80b8-86561c1a6f64\") " pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.182787 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxp7c\" (UniqueName: 
\"kubernetes.io/projected/5ba7848e-5d4e-4de0-a0de-2a8bcd534c90-kube-api-access-pxp7c\") pod \"telemetry-operator-controller-manager-5d4d74dd89-hkmlt\" (UID: \"5ba7848e-5d4e-4de0-a0de-2a8bcd534c90\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hkmlt" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.182819 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4274bd25-ba07-4036-80b8-86561c1a6f64-cert\") pod \"openstack-operator-controller-manager-545dfb464d-g5kjf\" (UID: \"4274bd25-ba07-4036-80b8-86561c1a6f64\") " pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.183722 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm"] Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.199238 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxp7c\" (UniqueName: \"kubernetes.io/projected/5ba7848e-5d4e-4de0-a0de-2a8bcd534c90-kube-api-access-pxp7c\") pod \"telemetry-operator-controller-manager-5d4d74dd89-hkmlt\" (UID: \"5ba7848e-5d4e-4de0-a0de-2a8bcd534c90\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hkmlt" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.203065 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wpbq\" (UniqueName: \"kubernetes.io/projected/36a7b704-074e-4b3c-a459-e55607c9f604-kube-api-access-6wpbq\") pod \"test-operator-controller-manager-5cd5cb47d7-vrfm9\" (UID: \"36a7b704-074e-4b3c-a459-e55607c9f604\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.217353 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.242905 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-wlzgd" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.270942 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.282256 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hkmlt" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.283344 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4274bd25-ba07-4036-80b8-86561c1a6f64-cert\") pod \"openstack-operator-controller-manager-545dfb464d-g5kjf\" (UID: \"4274bd25-ba07-4036-80b8-86561c1a6f64\") " pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.283465 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6js72\" (UniqueName: \"kubernetes.io/projected/4b675799-2c4a-4167-bd37-0de27bc8861d-kube-api-access-6js72\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm\" (UID: \"4b675799-2c4a-4167-bd37-0de27bc8861d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.283517 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdm54\" (UniqueName: \"kubernetes.io/projected/e317dfbb-55c5-49ee-8e16-5bc0532e2dfb-kube-api-access-jdm54\") pod \"watcher-operator-controller-manager-6cbc6dd547-frmfp\" (UID: 
\"e317dfbb-55c5-49ee-8e16-5bc0532e2dfb\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp" Oct 04 02:54:06 crc kubenswrapper[4964]: E1004 02:54:06.283527 4964 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.283572 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/892610de-e4c4-4b99-a0ca-07fc0ad63df2-cert\") pod \"infra-operator-controller-manager-658588b8c9-n5sks\" (UID: \"892610de-e4c4-4b99-a0ca-07fc0ad63df2\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" Oct 04 02:54:06 crc kubenswrapper[4964]: E1004 02:54:06.283596 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4274bd25-ba07-4036-80b8-86561c1a6f64-cert podName:4274bd25-ba07-4036-80b8-86561c1a6f64 nodeName:}" failed. No retries permitted until 2025-10-04 02:54:06.783576618 +0000 UTC m=+826.680535256 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4274bd25-ba07-4036-80b8-86561c1a6f64-cert") pod "openstack-operator-controller-manager-545dfb464d-g5kjf" (UID: "4274bd25-ba07-4036-80b8-86561c1a6f64") : secret "webhook-server-cert" not found Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.283652 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wprw9\" (UniqueName: \"kubernetes.io/projected/4274bd25-ba07-4036-80b8-86561c1a6f64-kube-api-access-wprw9\") pod \"openstack-operator-controller-manager-545dfb464d-g5kjf\" (UID: \"4274bd25-ba07-4036-80b8-86561c1a6f64\") " pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.305108 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.306475 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wprw9\" (UniqueName: \"kubernetes.io/projected/4274bd25-ba07-4036-80b8-86561c1a6f64-kube-api-access-wprw9\") pod \"openstack-operator-controller-manager-545dfb464d-g5kjf\" (UID: \"4274bd25-ba07-4036-80b8-86561c1a6f64\") " pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.307062 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/892610de-e4c4-4b99-a0ca-07fc0ad63df2-cert\") pod \"infra-operator-controller-manager-658588b8c9-n5sks\" (UID: \"892610de-e4c4-4b99-a0ca-07fc0ad63df2\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.318890 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdm54\" (UniqueName: \"kubernetes.io/projected/e317dfbb-55c5-49ee-8e16-5bc0532e2dfb-kube-api-access-jdm54\") pod \"watcher-operator-controller-manager-6cbc6dd547-frmfp\" (UID: \"e317dfbb-55c5-49ee-8e16-5bc0532e2dfb\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.393415 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6js72\" (UniqueName: \"kubernetes.io/projected/4b675799-2c4a-4167-bd37-0de27bc8861d-kube-api-access-6js72\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm\" (UID: \"4b675799-2c4a-4167-bd37-0de27bc8861d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.394098 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-85ckj" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.412226 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6js72\" (UniqueName: \"kubernetes.io/projected/4b675799-2c4a-4167-bd37-0de27bc8861d-kube-api-access-6js72\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm\" (UID: \"4b675799-2c4a-4167-bd37-0de27bc8861d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.455356 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.468097 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.499749 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.595574 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/762eb95d-bd98-4d86-8fc8-404234c0a13e-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm\" (UID: \"762eb95d-bd98-4d86-8fc8-404234c0a13e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.604375 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/762eb95d-bd98-4d86-8fc8-404234c0a13e-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm\" (UID: \"762eb95d-bd98-4d86-8fc8-404234c0a13e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.769908 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55cd88dfc-q48nc"] Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.792536 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-kcwbj"] Oct 04 02:54:06 crc kubenswrapper[4964]: W1004 02:54:06.795233 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9d92a9d_3e4a_4945_b449_4aed29708295.slice/crio-c13eb0dd9e75bc9382fd94d8bf049008d04b88a3501118b6416e858a9f50b17c WatchSource:0}: Error finding container c13eb0dd9e75bc9382fd94d8bf049008d04b88a3501118b6416e858a9f50b17c: Status 404 returned error can't find the container with id c13eb0dd9e75bc9382fd94d8bf049008d04b88a3501118b6416e858a9f50b17c Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.799924 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4274bd25-ba07-4036-80b8-86561c1a6f64-cert\") pod \"openstack-operator-controller-manager-545dfb464d-g5kjf\" (UID: \"4274bd25-ba07-4036-80b8-86561c1a6f64\") " pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" Oct 04 02:54:06 crc kubenswrapper[4964]: E1004 02:54:06.800216 4964 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 04 02:54:06 crc kubenswrapper[4964]: E1004 02:54:06.800263 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4274bd25-ba07-4036-80b8-86561c1a6f64-cert podName:4274bd25-ba07-4036-80b8-86561c1a6f64 nodeName:}" failed. No retries permitted until 2025-10-04 02:54:07.800246893 +0000 UTC m=+827.697205531 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4274bd25-ba07-4036-80b8-86561c1a6f64-cert") pod "openstack-operator-controller-manager-545dfb464d-g5kjf" (UID: "4274bd25-ba07-4036-80b8-86561c1a6f64") : secret "webhook-server-cert" not found Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.852458 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.855527 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-kcwbj" event={"ID":"a9d92a9d-3e4a-4945-b449-4aed29708295","Type":"ContainerStarted","Data":"c13eb0dd9e75bc9382fd94d8bf049008d04b88a3501118b6416e858a9f50b17c"} Oct 04 02:54:06 crc kubenswrapper[4964]: I1004 02:54:06.855562 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55cd88dfc-q48nc" event={"ID":"35f77ec9-6ace-4f9c-ad47-6956e222902b","Type":"ContainerStarted","Data":"20fa416755f056964d3538f66dd21385cb2dc65d96c9c4a099aaae983dd07bb6"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.036555 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-655d88ccb9-frmcj"] Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.072589 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-mbdfq"] Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.111735 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-qpmdv"] Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.137643 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-99sf8"] Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.164562 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh"] Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.171893 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-q7zh9"] Oct 
04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.177674 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-8f8rc"] Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.182852 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9"] Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.184797 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-wlzgd"] Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.190234 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt"] Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.191691 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-strsv"] Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.195479 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hkmlt"] Oct 04 02:54:07 crc kubenswrapper[4964]: W1004 02:54:07.198558 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fa08171_beb2_42d4_a751_fe46eb179a70.slice/crio-1ec228bd4ec7d7c64b3c5e9af0c0a2ee9794d29d7241394f01a1f8a495d59590 WatchSource:0}: Error finding container 1ec228bd4ec7d7c64b3c5e9af0c0a2ee9794d29d7241394f01a1f8a495d59590: Status 404 returned error can't find the container with id 1ec228bd4ec7d7c64b3c5e9af0c0a2ee9794d29d7241394f01a1f8a495d59590 Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.198705 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8"] Oct 04 02:54:07 crc kubenswrapper[4964]: W1004 02:54:07.199180 4964 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2611c21b_338e_4dc0_b977_e15067937730.slice/crio-a5a6f0517cec327b7c4fd513d21374ef135fd92d64e125a29001bfc502d1f3f3 WatchSource:0}: Error finding container a5a6f0517cec327b7c4fd513d21374ef135fd92d64e125a29001bfc502d1f3f3: Status 404 returned error can't find the container with id a5a6f0517cec327b7c4fd513d21374ef135fd92d64e125a29001bfc502d1f3f3 Oct 04 02:54:07 crc kubenswrapper[4964]: W1004 02:54:07.202138 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaab9e95e_6af6_483a_9cef_96a4accd24f9.slice/crio-5b6ebc54c667aafe7454028a576a1aff725a4e2ffcf1a2fac8c8e2deae0be624 WatchSource:0}: Error finding container 5b6ebc54c667aafe7454028a576a1aff725a4e2ffcf1a2fac8c8e2deae0be624: Status 404 returned error can't find the container with id 5b6ebc54c667aafe7454028a576a1aff725a4e2ffcf1a2fac8c8e2deae0be624 Oct 04 02:54:07 crc kubenswrapper[4964]: W1004 02:54:07.205854 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36a7b704_074e_4b3c_a459_e55607c9f604.slice/crio-14672d4fbfdd5dd078e6dbd54df66938298e0d1a3b8dd050c72c08febb919220 WatchSource:0}: Error finding container 14672d4fbfdd5dd078e6dbd54df66938298e0d1a3b8dd050c72c08febb919220: Status 404 returned error can't find the container with id 14672d4fbfdd5dd078e6dbd54df66938298e0d1a3b8dd050c72c08febb919220 Oct 04 02:54:07 crc kubenswrapper[4964]: W1004 02:54:07.207680 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode756e118_8e7e_4e1f_827d_cef4acdbb848.slice/crio-38524f44e1f905e0c38dcf9cc4d718b14f03c19630fadf0179e51b0fb444e3b8 WatchSource:0}: Error finding container 38524f44e1f905e0c38dcf9cc4d718b14f03c19630fadf0179e51b0fb444e3b8: Status 404 returned error can't 
find the container with id 38524f44e1f905e0c38dcf9cc4d718b14f03c19630fadf0179e51b0fb444e3b8 Oct 04 02:54:07 crc kubenswrapper[4964]: W1004 02:54:07.212555 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1244d8a_d20c_4318_9dfd_3617e35e54e9.slice/crio-968747b1c796fc3875b8790476945369ff3abd1955878ce3a5f0fa8e87574f7c WatchSource:0}: Error finding container 968747b1c796fc3875b8790476945369ff3abd1955878ce3a5f0fa8e87574f7c: Status 404 returned error can't find the container with id 968747b1c796fc3875b8790476945369ff3abd1955878ce3a5f0fa8e87574f7c Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.213722 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rspmn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7c7fc454ff-nkxrt_openstack-operators(e756e118-8e7e-4e1f-827d-cef4acdbb848): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.213979 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 
-3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6wpbq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd5cb47d7-vrfm9_openstack-operators(36a7b704-074e-4b3c-a459-e55607c9f604): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.215981 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ns9j5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6859f9b676-dzqh8_openstack-operators(c1244d8a-d20c-4318-9dfd-3617e35e54e9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.403665 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-85ckj"] Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.436294 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm"] Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.443031 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v"] Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.453940 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp"] Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.461270 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks"] Oct 04 02:54:07 crc 
kubenswrapper[4964]: E1004 02:54:07.481489 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9" podUID="36a7b704-074e-4b3c-a459-e55607c9f604" Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.481888 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8" podUID="c1244d8a-d20c-4318-9dfd-3617e35e54e9" Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.481947 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt" podUID="e756e118-8e7e-4e1f-827d-cef4acdbb848" Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.485720 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf"] Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.485957 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k79nn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-65d89cfd9f-sz72v_openstack-operators(be3f98e4-03d2-46bb-b7fe-bc050255934c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.486373 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6js72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm_openstack-operators(4b675799-2c4a-4167-bd37-0de27bc8861d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 04 
02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.486385 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dk4pn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-658588b8c9-n5sks_openstack-operators(892610de-e4c4-4b99-a0ca-07fc0ad63df2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.487234 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jdm54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6cbc6dd547-frmfp_openstack-operators(e317dfbb-55c5-49ee-8e16-5bc0532e2dfb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.487679 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm" podUID="4b675799-2c4a-4167-bd37-0de27bc8861d" Oct 04 02:54:07 crc 
kubenswrapper[4964]: E1004 02:54:07.499033 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:f923b76c1dd8fde02a5faf8a0a433cfacfb7b743f371de64a12e30d6efcde254,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hgdbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-579449c7d5-qgrnf_openstack-operators(b4751629-75d2-4c2a-afb5-a7b7915cb644): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.626342 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm"] Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.673450 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v" podUID="be3f98e4-03d2-46bb-b7fe-bc050255934c" Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.673585 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf" podUID="b4751629-75d2-4c2a-afb5-a7b7915cb644" Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.680738 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" podUID="892610de-e4c4-4b99-a0ca-07fc0ad63df2" Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.681792 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp" podUID="e317dfbb-55c5-49ee-8e16-5bc0532e2dfb" Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.837422 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4274bd25-ba07-4036-80b8-86561c1a6f64-cert\") pod \"openstack-operator-controller-manager-545dfb464d-g5kjf\" (UID: \"4274bd25-ba07-4036-80b8-86561c1a6f64\") " pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.846014 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4274bd25-ba07-4036-80b8-86561c1a6f64-cert\") pod \"openstack-operator-controller-manager-545dfb464d-g5kjf\" (UID: \"4274bd25-ba07-4036-80b8-86561c1a6f64\") " pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.857630 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-strsv" event={"ID":"2611c21b-338e-4dc0-b977-e15067937730","Type":"ContainerStarted","Data":"a5a6f0517cec327b7c4fd513d21374ef135fd92d64e125a29001bfc502d1f3f3"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.860400 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt" 
event={"ID":"e756e118-8e7e-4e1f-827d-cef4acdbb848","Type":"ContainerStarted","Data":"af080e7e8703d307e80fd1385b4c3f0a5500a0a3622eb7e94652befdba1913c2"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.860425 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt" event={"ID":"e756e118-8e7e-4e1f-827d-cef4acdbb848","Type":"ContainerStarted","Data":"38524f44e1f905e0c38dcf9cc4d718b14f03c19630fadf0179e51b0fb444e3b8"} Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.862259 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt" podUID="e756e118-8e7e-4e1f-827d-cef4acdbb848" Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.871100 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v" event={"ID":"be3f98e4-03d2-46bb-b7fe-bc050255934c","Type":"ContainerStarted","Data":"2ad26ce1fb049eadcceb1387cee49626f115ed47fc60374c287f75acf9bf0567"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.871127 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v" event={"ID":"be3f98e4-03d2-46bb-b7fe-bc050255934c","Type":"ContainerStarted","Data":"d6ef6de4856427b909411ba9253b31043c4911e686d222017da1c9c7615fe536"} Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.872233 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757\\\"\"" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v" podUID="be3f98e4-03d2-46bb-b7fe-bc050255934c" Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.874286 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp" event={"ID":"e317dfbb-55c5-49ee-8e16-5bc0532e2dfb","Type":"ContainerStarted","Data":"6175a6f6377eee47754ac59efa42d608b3adb611f0421963bac310b19bc1188b"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.874758 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp" event={"ID":"e317dfbb-55c5-49ee-8e16-5bc0532e2dfb","Type":"ContainerStarted","Data":"74ffa6260b3a7d368148bc2ed68c8d6d142fa9e0e2a03d770fae2c949dc9726d"} Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.875464 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp" podUID="e317dfbb-55c5-49ee-8e16-5bc0532e2dfb" Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.900763 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-frmcj" event={"ID":"1ac8bb69-05a0-4faa-a294-5243e4a2e21a","Type":"ContainerStarted","Data":"7929def2af0ab498ccbbe74d551080dbd7ccf0636a1168d67955f317e74eb88b"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.923163 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm" 
event={"ID":"4b675799-2c4a-4167-bd37-0de27bc8861d","Type":"ContainerStarted","Data":"bcc7cc724ec2da54d0e7c869479661245e6e99742b4f8ac1be33948d9d969b81"} Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.937293 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm" podUID="4b675799-2c4a-4167-bd37-0de27bc8861d" Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.941008 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hkmlt" event={"ID":"5ba7848e-5d4e-4de0-a0de-2a8bcd534c90","Type":"ContainerStarted","Data":"6436dbc4038e91d04763468d60890abbe4e7064ff1002ae4c4b9c46b1c9f7b5b"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.944083 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-wlzgd" event={"ID":"3fa08171-beb2-42d4-a751-fe46eb179a70","Type":"ContainerStarted","Data":"1ec228bd4ec7d7c64b3c5e9af0c0a2ee9794d29d7241394f01a1f8a495d59590"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.945297 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-qpmdv" event={"ID":"5d80f2c1-5570-48f8-908f-d580f7c7ecc7","Type":"ContainerStarted","Data":"f02cad85cd2c3df9e1dcc2f2111a6544c55d63df61c93ae6f6fcb34a9085d244"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.947260 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9" 
event={"ID":"36a7b704-074e-4b3c-a459-e55607c9f604","Type":"ContainerStarted","Data":"08de967755c7322ca63a60c99c633e98e558f09cd1c8c03fb56c4b7cdfbe4134"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.947282 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9" event={"ID":"36a7b704-074e-4b3c-a459-e55607c9f604","Type":"ContainerStarted","Data":"14672d4fbfdd5dd078e6dbd54df66938298e0d1a3b8dd050c72c08febb919220"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.948088 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-85ckj" event={"ID":"4c3ecd12-c2c1-48ba-b75d-e1df6fcb7a4a","Type":"ContainerStarted","Data":"8749e5ab109fec8c934f95a237b0b2e946702e29e56f0e0e6762eb856cc5617c"} Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.948530 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9" podUID="36a7b704-074e-4b3c-a459-e55607c9f604" Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.950129 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-8f8rc" event={"ID":"7a706485-5c1c-4f12-854b-779a385023fe","Type":"ContainerStarted","Data":"206f2a4ce00306674dc3da63b17248f63bb71aca5b46aad9377f858be980fffe"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.957312 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" 
event={"ID":"762eb95d-bd98-4d86-8fc8-404234c0a13e","Type":"ContainerStarted","Data":"6d3aec961f3566408528b83e7c6c283bb2587f0438b189494158fbdb25d3dfd7"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.959928 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" event={"ID":"892610de-e4c4-4b99-a0ca-07fc0ad63df2","Type":"ContainerStarted","Data":"17ec9dcced3bcbe01f67fc56260da46ed4b96528ae7c7216c01f26022ff92980"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.959954 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" event={"ID":"892610de-e4c4-4b99-a0ca-07fc0ad63df2","Type":"ContainerStarted","Data":"c6d78a8c00486aa5f1bcb5198725a965ee6c81ec889bbda4d8128a6e3c4d17b2"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.960716 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-mbdfq" event={"ID":"71f63b59-61c1-43ae-8726-bdc38806ee71","Type":"ContainerStarted","Data":"b2d1b9c272c82f9d33239cc449e1a5237b367308dbd6b47a5d4a308e2f9c10be"} Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.961136 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" podUID="892610de-e4c4-4b99-a0ca-07fc0ad63df2" Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.967219 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf" 
event={"ID":"b4751629-75d2-4c2a-afb5-a7b7915cb644","Type":"ContainerStarted","Data":"b7950985e476fe5f0f2e4d02a8531aa204ae02f3956b6c82321336f3ff7f1c5d"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.967247 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf" event={"ID":"b4751629-75d2-4c2a-afb5-a7b7915cb644","Type":"ContainerStarted","Data":"3471dc5eda7c7aae0e985061ae7b130cb11c32baa30a9937b938dd525a679780"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.968272 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh" event={"ID":"aab9e95e-6af6-483a-9cef-96a4accd24f9","Type":"ContainerStarted","Data":"5b6ebc54c667aafe7454028a576a1aff725a4e2ffcf1a2fac8c8e2deae0be624"} Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.968357 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f923b76c1dd8fde02a5faf8a0a433cfacfb7b743f371de64a12e30d6efcde254\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf" podUID="b4751629-75d2-4c2a-afb5-a7b7915cb644" Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.975646 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8" event={"ID":"c1244d8a-d20c-4318-9dfd-3617e35e54e9","Type":"ContainerStarted","Data":"b7a9caef8611cb25b54703acc1f05c445cf2bb8d477a1b6665915323d9b4252f"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.975688 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8" 
event={"ID":"c1244d8a-d20c-4318-9dfd-3617e35e54e9","Type":"ContainerStarted","Data":"968747b1c796fc3875b8790476945369ff3abd1955878ce3a5f0fa8e87574f7c"} Oct 04 02:54:07 crc kubenswrapper[4964]: I1004 02:54:07.989328 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" Oct 04 02:54:07 crc kubenswrapper[4964]: E1004 02:54:07.993974 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8" podUID="c1244d8a-d20c-4318-9dfd-3617e35e54e9" Oct 04 02:54:08 crc kubenswrapper[4964]: I1004 02:54:08.001818 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-99sf8" event={"ID":"7272252c-6b3a-4680-9f59-37bc87154be8","Type":"ContainerStarted","Data":"6149874c3c68847f3e70ef53494aff344b6c1d36ac2c772fe1a9e970e156e48b"} Oct 04 02:54:08 crc kubenswrapper[4964]: I1004 02:54:08.003119 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-q7zh9" event={"ID":"082a9114-18e1-40d0-829e-f2758614e49b","Type":"ContainerStarted","Data":"d229926f973e0ebf0e0f6fcf88ba8eac31fd43934ae425d0476029381eed451a"} Oct 04 02:54:08 crc kubenswrapper[4964]: I1004 02:54:08.428251 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf"] Oct 04 02:54:09 crc kubenswrapper[4964]: E1004 02:54:09.010935 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" podUID="892610de-e4c4-4b99-a0ca-07fc0ad63df2" Oct 04 02:54:09 crc kubenswrapper[4964]: E1004 02:54:09.011575 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9" podUID="36a7b704-074e-4b3c-a459-e55607c9f604" Oct 04 02:54:09 crc kubenswrapper[4964]: E1004 02:54:09.011665 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm" podUID="4b675799-2c4a-4167-bd37-0de27bc8861d" Oct 04 02:54:09 crc kubenswrapper[4964]: E1004 02:54:09.011687 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp" podUID="e317dfbb-55c5-49ee-8e16-5bc0532e2dfb" Oct 04 02:54:09 crc kubenswrapper[4964]: E1004 02:54:09.011741 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f923b76c1dd8fde02a5faf8a0a433cfacfb7b743f371de64a12e30d6efcde254\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf" podUID="b4751629-75d2-4c2a-afb5-a7b7915cb644" Oct 04 02:54:09 crc kubenswrapper[4964]: E1004 02:54:09.012715 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt" podUID="e756e118-8e7e-4e1f-827d-cef4acdbb848" Oct 04 02:54:09 crc kubenswrapper[4964]: E1004 02:54:09.012767 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757\\\"\"" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v" podUID="be3f98e4-03d2-46bb-b7fe-bc050255934c" Oct 04 02:54:09 crc kubenswrapper[4964]: E1004 02:54:09.012897 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8" podUID="c1244d8a-d20c-4318-9dfd-3617e35e54e9" Oct 04 02:54:09 crc kubenswrapper[4964]: W1004 02:54:09.702300 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4274bd25_ba07_4036_80b8_86561c1a6f64.slice/crio-95aebbf4dbb4ec62ec914ba067b218e4db3dc6235f653730e4bfe9fdb11728bf WatchSource:0}: Error finding 
container 95aebbf4dbb4ec62ec914ba067b218e4db3dc6235f653730e4bfe9fdb11728bf: Status 404 returned error can't find the container with id 95aebbf4dbb4ec62ec914ba067b218e4db3dc6235f653730e4bfe9fdb11728bf Oct 04 02:54:10 crc kubenswrapper[4964]: I1004 02:54:10.016910 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" event={"ID":"4274bd25-ba07-4036-80b8-86561c1a6f64","Type":"ContainerStarted","Data":"95aebbf4dbb4ec62ec914ba067b218e4db3dc6235f653730e4bfe9fdb11728bf"} Oct 04 02:54:10 crc kubenswrapper[4964]: I1004 02:54:10.604926 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:54:10 crc kubenswrapper[4964]: I1004 02:54:10.645429 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4nhk4"] Oct 04 02:54:11 crc kubenswrapper[4964]: I1004 02:54:11.026368 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4nhk4" podUID="f0ac9809-0e8c-4579-813a-9acacca251d5" containerName="registry-server" containerID="cri-o://5fbfdae6eb44332f45b6c7c968787c985530813b2c9634fabafe3099915d8648" gracePeriod=2 Oct 04 02:54:11 crc kubenswrapper[4964]: I1004 02:54:11.563596 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:11 crc kubenswrapper[4964]: I1004 02:54:11.563661 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:11 crc kubenswrapper[4964]: I1004 02:54:11.615723 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:12 crc kubenswrapper[4964]: I1004 02:54:12.040929 4964 generic.go:334] "Generic (PLEG): container finished" 
podID="f0ac9809-0e8c-4579-813a-9acacca251d5" containerID="5fbfdae6eb44332f45b6c7c968787c985530813b2c9634fabafe3099915d8648" exitCode=0 Oct 04 02:54:12 crc kubenswrapper[4964]: I1004 02:54:12.040984 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nhk4" event={"ID":"f0ac9809-0e8c-4579-813a-9acacca251d5","Type":"ContainerDied","Data":"5fbfdae6eb44332f45b6c7c968787c985530813b2c9634fabafe3099915d8648"} Oct 04 02:54:12 crc kubenswrapper[4964]: I1004 02:54:12.098610 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:13 crc kubenswrapper[4964]: I1004 02:54:13.236899 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwknv"] Oct 04 02:54:14 crc kubenswrapper[4964]: I1004 02:54:14.057028 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zwknv" podUID="72ebe95e-5523-4fa6-8a2a-c2c553f315b1" containerName="registry-server" containerID="cri-o://06bcb6f4f12ef71723138814d4c15774e1ffe64b1c8461b644c26b78fc22fb15" gracePeriod=2 Oct 04 02:54:16 crc kubenswrapper[4964]: I1004 02:54:16.078683 4964 generic.go:334] "Generic (PLEG): container finished" podID="72ebe95e-5523-4fa6-8a2a-c2c553f315b1" containerID="06bcb6f4f12ef71723138814d4c15774e1ffe64b1c8461b644c26b78fc22fb15" exitCode=0 Oct 04 02:54:16 crc kubenswrapper[4964]: I1004 02:54:16.078741 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwknv" event={"ID":"72ebe95e-5523-4fa6-8a2a-c2c553f315b1","Type":"ContainerDied","Data":"06bcb6f4f12ef71723138814d4c15774e1ffe64b1c8461b644c26b78fc22fb15"} Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.040169 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.096850 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwknv" event={"ID":"72ebe95e-5523-4fa6-8a2a-c2c553f315b1","Type":"ContainerDied","Data":"e1b710a04b9a802b23ecd70842cf1e0e81f3a3415d7a9ace80774089375a2723"} Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.096908 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1b710a04b9a802b23ecd70842cf1e0e81f3a3415d7a9ace80774089375a2723" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.099363 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" event={"ID":"4274bd25-ba07-4036-80b8-86561c1a6f64","Type":"ContainerStarted","Data":"5421b16328bfc92e6e56652ecd181439d99fff64333a55e2a082b9f44023738b"} Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.108020 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nhk4" event={"ID":"f0ac9809-0e8c-4579-813a-9acacca251d5","Type":"ContainerDied","Data":"29d2a4d578a4ccd5dceacdeaf9873b326a1a40e7da0c8a7377d6d4b531f84767"} Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.108091 4964 scope.go:117] "RemoveContainer" containerID="5fbfdae6eb44332f45b6c7c968787c985530813b2c9634fabafe3099915d8648" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.108102 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4nhk4" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.119214 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0ac9809-0e8c-4579-813a-9acacca251d5-utilities\") pod \"f0ac9809-0e8c-4579-813a-9acacca251d5\" (UID: \"f0ac9809-0e8c-4579-813a-9acacca251d5\") " Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.119466 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0ac9809-0e8c-4579-813a-9acacca251d5-catalog-content\") pod \"f0ac9809-0e8c-4579-813a-9acacca251d5\" (UID: \"f0ac9809-0e8c-4579-813a-9acacca251d5\") " Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.119531 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5jnf\" (UniqueName: \"kubernetes.io/projected/f0ac9809-0e8c-4579-813a-9acacca251d5-kube-api-access-c5jnf\") pod \"f0ac9809-0e8c-4579-813a-9acacca251d5\" (UID: \"f0ac9809-0e8c-4579-813a-9acacca251d5\") " Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.124401 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ac9809-0e8c-4579-813a-9acacca251d5-kube-api-access-c5jnf" (OuterVolumeSpecName: "kube-api-access-c5jnf") pod "f0ac9809-0e8c-4579-813a-9acacca251d5" (UID: "f0ac9809-0e8c-4579-813a-9acacca251d5"). InnerVolumeSpecName "kube-api-access-c5jnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.127344 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ac9809-0e8c-4579-813a-9acacca251d5-utilities" (OuterVolumeSpecName: "utilities") pod "f0ac9809-0e8c-4579-813a-9acacca251d5" (UID: "f0ac9809-0e8c-4579-813a-9acacca251d5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.163993 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.166000 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ac9809-0e8c-4579-813a-9acacca251d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0ac9809-0e8c-4579-813a-9acacca251d5" (UID: "f0ac9809-0e8c-4579-813a-9acacca251d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.169798 4964 scope.go:117] "RemoveContainer" containerID="ded72e26705cda3cc1edd32107058cbbd402456893b7bdb1287ddb5c072285e8" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.220535 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgxqq\" (UniqueName: \"kubernetes.io/projected/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-kube-api-access-mgxqq\") pod \"72ebe95e-5523-4fa6-8a2a-c2c553f315b1\" (UID: \"72ebe95e-5523-4fa6-8a2a-c2c553f315b1\") " Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.220574 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-utilities\") pod \"72ebe95e-5523-4fa6-8a2a-c2c553f315b1\" (UID: \"72ebe95e-5523-4fa6-8a2a-c2c553f315b1\") " Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.220603 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-catalog-content\") pod \"72ebe95e-5523-4fa6-8a2a-c2c553f315b1\" (UID: \"72ebe95e-5523-4fa6-8a2a-c2c553f315b1\") " Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 
02:54:18.220840 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5jnf\" (UniqueName: \"kubernetes.io/projected/f0ac9809-0e8c-4579-813a-9acacca251d5-kube-api-access-c5jnf\") on node \"crc\" DevicePath \"\"" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.220851 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0ac9809-0e8c-4579-813a-9acacca251d5-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.220860 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0ac9809-0e8c-4579-813a-9acacca251d5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.221761 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-utilities" (OuterVolumeSpecName: "utilities") pod "72ebe95e-5523-4fa6-8a2a-c2c553f315b1" (UID: "72ebe95e-5523-4fa6-8a2a-c2c553f315b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.226720 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-kube-api-access-mgxqq" (OuterVolumeSpecName: "kube-api-access-mgxqq") pod "72ebe95e-5523-4fa6-8a2a-c2c553f315b1" (UID: "72ebe95e-5523-4fa6-8a2a-c2c553f315b1"). InnerVolumeSpecName "kube-api-access-mgxqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.267432 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72ebe95e-5523-4fa6-8a2a-c2c553f315b1" (UID: "72ebe95e-5523-4fa6-8a2a-c2c553f315b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.269868 4964 scope.go:117] "RemoveContainer" containerID="d7cf1820d5654d8daabf3e2a9ba455b663b94275811cc12170f6c0989ae0d3e6" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.321720 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgxqq\" (UniqueName: \"kubernetes.io/projected/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-kube-api-access-mgxqq\") on node \"crc\" DevicePath \"\"" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.321763 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.321778 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72ebe95e-5523-4fa6-8a2a-c2c553f315b1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.455251 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4nhk4"] Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.459855 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4nhk4"] Oct 04 02:54:18 crc kubenswrapper[4964]: I1004 02:54:18.904973 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f0ac9809-0e8c-4579-813a-9acacca251d5" path="/var/lib/kubelet/pods/f0ac9809-0e8c-4579-813a-9acacca251d5/volumes" Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.117927 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-qpmdv" event={"ID":"5d80f2c1-5570-48f8-908f-d580f7c7ecc7","Type":"ContainerStarted","Data":"23c87397bd582537b2caeb028c169659a20ff3ce5e68f3aaab4b9ea511b5d897"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.123319 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-8f8rc" event={"ID":"7a706485-5c1c-4f12-854b-779a385023fe","Type":"ContainerStarted","Data":"08c18d9e0d0f95cf6bf030458e71f4b5c470ef51befde8e99322d3c560cbb813"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.124597 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-kcwbj" event={"ID":"a9d92a9d-3e4a-4945-b449-4aed29708295","Type":"ContainerStarted","Data":"13f5c751e0ae9cb7b5c6f4615ea8c63c945c0f69b97a7555e08b9982bf0be5c5"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.126633 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" event={"ID":"762eb95d-bd98-4d86-8fc8-404234c0a13e","Type":"ContainerStarted","Data":"bac31473ed3226512082ea9b56a6bf2543d0a4ec898a0a5258dad389d520f6ef"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.136951 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-wlzgd" event={"ID":"3fa08171-beb2-42d4-a751-fe46eb179a70","Type":"ContainerStarted","Data":"ff4948609a7c85471b8cc98acfdab0342ca287bb7a5e90bd184b1647fbc0e09a"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.143959 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-strsv" event={"ID":"2611c21b-338e-4dc0-b977-e15067937730","Type":"ContainerStarted","Data":"939ea9ffa4cedecce8de1ede802c8970aacd81d1e02a78b7c8955b8d0f256c94"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.159216 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" event={"ID":"4274bd25-ba07-4036-80b8-86561c1a6f64","Type":"ContainerStarted","Data":"e2f84a973eaad66d75ddb79483ef76be124db4d67ff8fcae0fe92d0a86ae00f2"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.159782 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.165268 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-q7zh9" event={"ID":"082a9114-18e1-40d0-829e-f2758614e49b","Type":"ContainerStarted","Data":"ba5b95808765b6b5e4440004d69b127bdb631d313490caed8559c8aa16aebe0c"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.165318 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-q7zh9" event={"ID":"082a9114-18e1-40d0-829e-f2758614e49b","Type":"ContainerStarted","Data":"fb5c1d4ff10dac80ed36e5d99402616d229e49ee7742de59bc609d3f096cc346"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.165459 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-q7zh9" Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.170357 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55cd88dfc-q48nc" 
event={"ID":"35f77ec9-6ace-4f9c-ad47-6956e222902b","Type":"ContainerStarted","Data":"6e1925248ad1c8e72c5ff51d0e7180f02e84f49541ef2e260274b832c87b25ca"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.176170 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh" event={"ID":"aab9e95e-6af6-483a-9cef-96a4accd24f9","Type":"ContainerStarted","Data":"289df9c9359889664dd4c13002f508a01c93b3f7ba0be51c539f0f7c9ea88c4f"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.177334 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-99sf8" event={"ID":"7272252c-6b3a-4680-9f59-37bc87154be8","Type":"ContainerStarted","Data":"a8990e17dde8dfd1c9a5cd4214967dc2b2f57ba86a520f51d586d857a349617c"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.178679 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-mbdfq" event={"ID":"71f63b59-61c1-43ae-8726-bdc38806ee71","Type":"ContainerStarted","Data":"3418ec3eb3c23cc5264df3f99ef02daa66c480f05343ff714cd602bb7a785851"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.180195 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-frmcj" event={"ID":"1ac8bb69-05a0-4faa-a294-5243e4a2e21a","Type":"ContainerStarted","Data":"034aa84718e17c8fe6818ee099feaec53c5d223dd20f46d536d88dd021506b68"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.181307 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-85ckj" event={"ID":"4c3ecd12-c2c1-48ba-b75d-e1df6fcb7a4a","Type":"ContainerStarted","Data":"90504cc1a76858b6a90f58aa2ad206cc39468fffb86cb6af44f38b512958bc5a"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.181334 4964 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-85ckj" event={"ID":"4c3ecd12-c2c1-48ba-b75d-e1df6fcb7a4a","Type":"ContainerStarted","Data":"3fcc7a806a6ca17d131d0fc14d87a6b3f2eb6020d656d4c19118e2673380c6bc"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.181508 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-85ckj" Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.191507 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwknv" Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.191828 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hkmlt" event={"ID":"5ba7848e-5d4e-4de0-a0de-2a8bcd534c90","Type":"ContainerStarted","Data":"9ce5cedb5d3942c9c2e4b150ee07365d1abe3d0ff36f899ad1b9aac23b2ce83a"} Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.196229 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" podStartSLOduration=13.19621541 podStartE2EDuration="13.19621541s" podCreationTimestamp="2025-10-04 02:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:54:19.193960532 +0000 UTC m=+839.090919170" watchObservedRunningTime="2025-10-04 02:54:19.19621541 +0000 UTC m=+839.093174048" Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.226851 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-q7zh9" podStartSLOduration=3.729997755 podStartE2EDuration="14.226836255s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.172658796 
+0000 UTC m=+827.069617434" lastFinishedPulling="2025-10-04 02:54:17.669497296 +0000 UTC m=+837.566455934" observedRunningTime="2025-10-04 02:54:19.222951944 +0000 UTC m=+839.119910592" watchObservedRunningTime="2025-10-04 02:54:19.226836255 +0000 UTC m=+839.123794893" Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.247572 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-85ckj" podStartSLOduration=4.029223864 podStartE2EDuration="14.247553581s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.449523386 +0000 UTC m=+827.346482024" lastFinishedPulling="2025-10-04 02:54:17.667853093 +0000 UTC m=+837.564811741" observedRunningTime="2025-10-04 02:54:19.243638377 +0000 UTC m=+839.140597025" watchObservedRunningTime="2025-10-04 02:54:19.247553581 +0000 UTC m=+839.144512219" Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.270678 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwknv"] Oct 04 02:54:19 crc kubenswrapper[4964]: I1004 02:54:19.274753 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zwknv"] Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.085572 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f5xcw"] Oct 04 02:54:20 crc kubenswrapper[4964]: E1004 02:54:20.086603 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ebe95e-5523-4fa6-8a2a-c2c553f315b1" containerName="registry-server" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.086681 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ebe95e-5523-4fa6-8a2a-c2c553f315b1" containerName="registry-server" Oct 04 02:54:20 crc kubenswrapper[4964]: E1004 02:54:20.086712 4964 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f0ac9809-0e8c-4579-813a-9acacca251d5" containerName="registry-server" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.086730 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ac9809-0e8c-4579-813a-9acacca251d5" containerName="registry-server" Oct 04 02:54:20 crc kubenswrapper[4964]: E1004 02:54:20.086768 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ac9809-0e8c-4579-813a-9acacca251d5" containerName="extract-content" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.086787 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ac9809-0e8c-4579-813a-9acacca251d5" containerName="extract-content" Oct 04 02:54:20 crc kubenswrapper[4964]: E1004 02:54:20.086809 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ac9809-0e8c-4579-813a-9acacca251d5" containerName="extract-utilities" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.086825 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ac9809-0e8c-4579-813a-9acacca251d5" containerName="extract-utilities" Oct 04 02:54:20 crc kubenswrapper[4964]: E1004 02:54:20.086858 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ebe95e-5523-4fa6-8a2a-c2c553f315b1" containerName="extract-content" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.086875 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ebe95e-5523-4fa6-8a2a-c2c553f315b1" containerName="extract-content" Oct 04 02:54:20 crc kubenswrapper[4964]: E1004 02:54:20.086906 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ebe95e-5523-4fa6-8a2a-c2c553f315b1" containerName="extract-utilities" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.086918 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ebe95e-5523-4fa6-8a2a-c2c553f315b1" containerName="extract-utilities" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.087184 4964 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f0ac9809-0e8c-4579-813a-9acacca251d5" containerName="registry-server" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.087217 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="72ebe95e-5523-4fa6-8a2a-c2c553f315b1" containerName="registry-server" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.089194 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.100810 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f5xcw"] Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.200623 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-99sf8" event={"ID":"7272252c-6b3a-4680-9f59-37bc87154be8","Type":"ContainerStarted","Data":"8034bdaa8e8b461f67d4552368984765b24684047c72bdce2cd6bde6524f8a36"} Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.200700 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-99sf8" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.202323 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-strsv" event={"ID":"2611c21b-338e-4dc0-b977-e15067937730","Type":"ContainerStarted","Data":"e9058b308bf790f6d68117065742fba58ad22f826c531c2f66d3b11a7785ee36"} Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.202485 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-strsv" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.203702 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-mbdfq" 
event={"ID":"71f63b59-61c1-43ae-8726-bdc38806ee71","Type":"ContainerStarted","Data":"479592717f7b7789ac4b3b18e9caa07a35c76f88089f8241aadf0f306a1f8d31"} Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.203822 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-mbdfq" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.205008 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-kcwbj" event={"ID":"a9d92a9d-3e4a-4945-b449-4aed29708295","Type":"ContainerStarted","Data":"324ccc51ffe2ca00694a50c830a91824b6d6da411aacec22d5fe6094032d13e3"} Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.205105 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-kcwbj" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.206317 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55cd88dfc-q48nc" event={"ID":"35f77ec9-6ace-4f9c-ad47-6956e222902b","Type":"ContainerStarted","Data":"efe42ab045508f72437d218f322b3699c02523f594e3b6d1fb9bc892576c9a22"} Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.206426 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55cd88dfc-q48nc" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.207836 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh" event={"ID":"aab9e95e-6af6-483a-9cef-96a4accd24f9","Type":"ContainerStarted","Data":"04f23e21672906b0fff82a63cf9a2f6c279b0962a52d0c318bc841ebfcff61a3"} Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.207897 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.209149 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hkmlt" event={"ID":"5ba7848e-5d4e-4de0-a0de-2a8bcd534c90","Type":"ContainerStarted","Data":"51e45ce4ee1e8836a4f3b8fa8815b69987637e518d3528f790123da370e4bb9f"} Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.209277 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hkmlt" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.211094 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-wlzgd" event={"ID":"3fa08171-beb2-42d4-a751-fe46eb179a70","Type":"ContainerStarted","Data":"fb9141d897f8d07f5fd972e306d351a85dfd943c8113a8d8661cd5b66ab3babe"} Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.211269 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-wlzgd" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.212889 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-qpmdv" event={"ID":"5d80f2c1-5570-48f8-908f-d580f7c7ecc7","Type":"ContainerStarted","Data":"9fe20a928ea6049b4e467b07a47d012df44e886c294cb2454510864b490f9e2b"} Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.212970 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-qpmdv" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.214823 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-frmcj" 
event={"ID":"1ac8bb69-05a0-4faa-a294-5243e4a2e21a","Type":"ContainerStarted","Data":"abbc42f4995fcdc632d9df575fe773690351f7ffa391c304a3b586c912e74c6a"} Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.214914 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-frmcj" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.216843 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-8f8rc" event={"ID":"7a706485-5c1c-4f12-854b-779a385023fe","Type":"ContainerStarted","Data":"aa2a0f2bda6b1afe10814a47816e7950e90d1c203f817415f63ca919824b6a51"} Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.216960 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-8f8rc" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.218414 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" event={"ID":"762eb95d-bd98-4d86-8fc8-404234c0a13e","Type":"ContainerStarted","Data":"846dcc04c4ec170bd40d76e93cdfa005c5fd78014f7273ded881f0c0392606c5"} Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.246135 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-utilities\") pod \"redhat-operators-f5xcw\" (UID: \"f1efba6a-676d-4b85-9ca5-2e6b3eae4978\") " pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.246200 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-catalog-content\") pod \"redhat-operators-f5xcw\" (UID: 
\"f1efba6a-676d-4b85-9ca5-2e6b3eae4978\") " pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.247144 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqgl8\" (UniqueName: \"kubernetes.io/projected/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-kube-api-access-kqgl8\") pod \"redhat-operators-f5xcw\" (UID: \"f1efba6a-676d-4b85-9ca5-2e6b3eae4978\") " pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.250158 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-99sf8" podStartSLOduration=4.728254705 podStartE2EDuration="15.250144074s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.148496191 +0000 UTC m=+827.045454829" lastFinishedPulling="2025-10-04 02:54:17.67038556 +0000 UTC m=+837.567344198" observedRunningTime="2025-10-04 02:54:20.232255943 +0000 UTC m=+840.129214571" watchObservedRunningTime="2025-10-04 02:54:20.250144074 +0000 UTC m=+840.147102702" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.252718 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-mbdfq" podStartSLOduration=4.658645663 podStartE2EDuration="15.25271329s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.116840918 +0000 UTC m=+827.013799556" lastFinishedPulling="2025-10-04 02:54:17.710908545 +0000 UTC m=+837.607867183" observedRunningTime="2025-10-04 02:54:20.247508244 +0000 UTC m=+840.144466882" watchObservedRunningTime="2025-10-04 02:54:20.25271329 +0000 UTC m=+840.149671928" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.265535 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh" podStartSLOduration=4.7972126280000005 podStartE2EDuration="15.265521148s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.208094828 +0000 UTC m=+827.105053466" lastFinishedPulling="2025-10-04 02:54:17.676403348 +0000 UTC m=+837.573361986" observedRunningTime="2025-10-04 02:54:20.262547089 +0000 UTC m=+840.159505727" watchObservedRunningTime="2025-10-04 02:54:20.265521148 +0000 UTC m=+840.162479786" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.284377 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-qpmdv" podStartSLOduration=4.759573908 podStartE2EDuration="15.284362523s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.145185673 +0000 UTC m=+827.042144301" lastFinishedPulling="2025-10-04 02:54:17.669974278 +0000 UTC m=+837.566932916" observedRunningTime="2025-10-04 02:54:20.280364358 +0000 UTC m=+840.177323006" watchObservedRunningTime="2025-10-04 02:54:20.284362523 +0000 UTC m=+840.181321171" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.304287 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hkmlt" podStartSLOduration=4.810673292 podStartE2EDuration="15.304274497s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.213301465 +0000 UTC m=+827.110260103" lastFinishedPulling="2025-10-04 02:54:17.70690267 +0000 UTC m=+837.603861308" observedRunningTime="2025-10-04 02:54:20.298166136 +0000 UTC m=+840.195124784" watchObservedRunningTime="2025-10-04 02:54:20.304274497 +0000 UTC m=+840.201233145" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.325746 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/horizon-operator-controller-manager-54876c876f-8f8rc" podStartSLOduration=4.794934338 podStartE2EDuration="15.325732091s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.178712455 +0000 UTC m=+827.075671093" lastFinishedPulling="2025-10-04 02:54:17.709510198 +0000 UTC m=+837.606468846" observedRunningTime="2025-10-04 02:54:20.323141973 +0000 UTC m=+840.220100621" watchObservedRunningTime="2025-10-04 02:54:20.325732091 +0000 UTC m=+840.222690739" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.348540 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqgl8\" (UniqueName: \"kubernetes.io/projected/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-kube-api-access-kqgl8\") pod \"redhat-operators-f5xcw\" (UID: \"f1efba6a-676d-4b85-9ca5-2e6b3eae4978\") " pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.348766 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-utilities\") pod \"redhat-operators-f5xcw\" (UID: \"f1efba6a-676d-4b85-9ca5-2e6b3eae4978\") " pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.348932 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-catalog-content\") pod \"redhat-operators-f5xcw\" (UID: \"f1efba6a-676d-4b85-9ca5-2e6b3eae4978\") " pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.349693 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-catalog-content\") pod \"redhat-operators-f5xcw\" (UID: 
\"f1efba6a-676d-4b85-9ca5-2e6b3eae4978\") " pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.349731 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-utilities\") pod \"redhat-operators-f5xcw\" (UID: \"f1efba6a-676d-4b85-9ca5-2e6b3eae4978\") " pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.359765 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" podStartSLOduration=5.30838435 podStartE2EDuration="15.359751706s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.655939354 +0000 UTC m=+827.552897992" lastFinishedPulling="2025-10-04 02:54:17.70730669 +0000 UTC m=+837.604265348" observedRunningTime="2025-10-04 02:54:20.355606336 +0000 UTC m=+840.252564984" watchObservedRunningTime="2025-10-04 02:54:20.359751706 +0000 UTC m=+840.256710354" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.375434 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqgl8\" (UniqueName: \"kubernetes.io/projected/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-kube-api-access-kqgl8\") pod \"redhat-operators-f5xcw\" (UID: \"f1efba6a-676d-4b85-9ca5-2e6b3eae4978\") " pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.380243 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-frmcj" podStartSLOduration=4.73333083 podStartE2EDuration="15.380224494s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.058954307 +0000 UTC m=+826.955912945" lastFinishedPulling="2025-10-04 02:54:17.705847951 +0000 
UTC m=+837.602806609" observedRunningTime="2025-10-04 02:54:20.376950538 +0000 UTC m=+840.273909186" watchObservedRunningTime="2025-10-04 02:54:20.380224494 +0000 UTC m=+840.277183132" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.400152 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55cd88dfc-q48nc" podStartSLOduration=4.493379329 podStartE2EDuration="15.400138327s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:06.774064845 +0000 UTC m=+826.671023483" lastFinishedPulling="2025-10-04 02:54:17.680823833 +0000 UTC m=+837.577782481" observedRunningTime="2025-10-04 02:54:20.395454624 +0000 UTC m=+840.292413262" watchObservedRunningTime="2025-10-04 02:54:20.400138327 +0000 UTC m=+840.297096965" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.412765 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-kcwbj" podStartSLOduration=4.501525273 podStartE2EDuration="15.412751969s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:06.797095761 +0000 UTC m=+826.694054399" lastFinishedPulling="2025-10-04 02:54:17.708322437 +0000 UTC m=+837.605281095" observedRunningTime="2025-10-04 02:54:20.410218433 +0000 UTC m=+840.307177071" watchObservedRunningTime="2025-10-04 02:54:20.412751969 +0000 UTC m=+840.309710607" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.421937 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.437955 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-strsv" podStartSLOduration=4.927967867 podStartE2EDuration="15.437924081s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.201504405 +0000 UTC m=+827.098463043" lastFinishedPulling="2025-10-04 02:54:17.711460619 +0000 UTC m=+837.608419257" observedRunningTime="2025-10-04 02:54:20.435180679 +0000 UTC m=+840.332139317" watchObservedRunningTime="2025-10-04 02:54:20.437924081 +0000 UTC m=+840.334882719" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.450610 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-wlzgd" podStartSLOduration=4.910894916 podStartE2EDuration="15.450592964s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.201086443 +0000 UTC m=+827.098045071" lastFinishedPulling="2025-10-04 02:54:17.740784481 +0000 UTC m=+837.637743119" observedRunningTime="2025-10-04 02:54:20.448640932 +0000 UTC m=+840.345599570" watchObservedRunningTime="2025-10-04 02:54:20.450592964 +0000 UTC m=+840.347551602" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.859476 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72ebe95e-5523-4fa6-8a2a-c2c553f315b1" path="/var/lib/kubelet/pods/72ebe95e-5523-4fa6-8a2a-c2c553f315b1/volumes" Oct 04 02:54:20 crc kubenswrapper[4964]: I1004 02:54:20.864113 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f5xcw"] Oct 04 02:54:21 crc kubenswrapper[4964]: I1004 02:54:21.225485 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5xcw" 
event={"ID":"f1efba6a-676d-4b85-9ca5-2e6b3eae4978","Type":"ContainerStarted","Data":"6d90cd7c94e9ece8b6b414bf2f156371c02b871db21d6b9dc4c833422f2acacd"} Oct 04 02:54:21 crc kubenswrapper[4964]: I1004 02:54:21.227044 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" Oct 04 02:54:24 crc kubenswrapper[4964]: I1004 02:54:24.278875 4964 generic.go:334] "Generic (PLEG): container finished" podID="f1efba6a-676d-4b85-9ca5-2e6b3eae4978" containerID="1d18c07d77a8c7fbebcf5b4da0bd23817f40f68fa43373700bdbb248b8cf05b5" exitCode=0 Oct 04 02:54:24 crc kubenswrapper[4964]: I1004 02:54:24.278920 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5xcw" event={"ID":"f1efba6a-676d-4b85-9ca5-2e6b3eae4978","Type":"ContainerDied","Data":"1d18c07d77a8c7fbebcf5b4da0bd23817f40f68fa43373700bdbb248b8cf05b5"} Oct 04 02:54:25 crc kubenswrapper[4964]: I1004 02:54:25.702267 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-kcwbj" Oct 04 02:54:25 crc kubenswrapper[4964]: I1004 02:54:25.711266 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55cd88dfc-q48nc" Oct 04 02:54:25 crc kubenswrapper[4964]: I1004 02:54:25.743953 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-q7zh9" Oct 04 02:54:25 crc kubenswrapper[4964]: I1004 02:54:25.746887 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-mbdfq" Oct 04 02:54:25 crc kubenswrapper[4964]: I1004 02:54:25.805105 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-qpmdv" Oct 04 02:54:25 crc kubenswrapper[4964]: I1004 02:54:25.818326 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-8f8rc" Oct 04 02:54:25 crc kubenswrapper[4964]: I1004 02:54:25.869042 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-99sf8" Oct 04 02:54:25 crc kubenswrapper[4964]: I1004 02:54:25.902382 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-frmcj" Oct 04 02:54:26 crc kubenswrapper[4964]: I1004 02:54:26.065952 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh" Oct 04 02:54:26 crc kubenswrapper[4964]: I1004 02:54:26.172520 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-strsv" Oct 04 02:54:26 crc kubenswrapper[4964]: I1004 02:54:26.246103 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-wlzgd" Oct 04 02:54:26 crc kubenswrapper[4964]: I1004 02:54:26.285010 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hkmlt" Oct 04 02:54:26 crc kubenswrapper[4964]: I1004 02:54:26.397313 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-85ckj" Oct 04 02:54:26 crc kubenswrapper[4964]: I1004 02:54:26.866247 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm" Oct 04 02:54:27 crc kubenswrapper[4964]: I1004 02:54:27.997828 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-545dfb464d-g5kjf" Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.317219 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm" event={"ID":"4b675799-2c4a-4167-bd37-0de27bc8861d","Type":"ContainerStarted","Data":"9e5dd21c3a548eca83e3dec3fb121c826346818fad88a2c5305fa93426eb1fa1"} Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.319293 4964 generic.go:334] "Generic (PLEG): container finished" podID="f1efba6a-676d-4b85-9ca5-2e6b3eae4978" containerID="25ba730abaee39813fa2e0b17c5fac1d749159e04313e3a8df5cd1ee28da3db9" exitCode=0 Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.319353 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5xcw" event={"ID":"f1efba6a-676d-4b85-9ca5-2e6b3eae4978","Type":"ContainerDied","Data":"25ba730abaee39813fa2e0b17c5fac1d749159e04313e3a8df5cd1ee28da3db9"} Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.321535 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9" event={"ID":"36a7b704-074e-4b3c-a459-e55607c9f604","Type":"ContainerStarted","Data":"6d6a091395eb187d391c6b2d95282d0032d7f3b023ae9da4ba0caf777aba26c4"} Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.321793 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9" Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.323944 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" 
event={"ID":"892610de-e4c4-4b99-a0ca-07fc0ad63df2","Type":"ContainerStarted","Data":"e439b55cc816b1993d2e044612c360689a3567e84571a654e978b87db40a861a"} Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.324165 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.326239 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8" event={"ID":"c1244d8a-d20c-4318-9dfd-3617e35e54e9","Type":"ContainerStarted","Data":"c892069310d08422a8e2c2e467344405ac97a46d9b50601829f87e6f57892a7d"} Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.326406 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8" Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.337802 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt" event={"ID":"e756e118-8e7e-4e1f-827d-cef4acdbb848","Type":"ContainerStarted","Data":"8df02564ea0ad5fea67768a10d138a5ea7d0fe48ac08d843fd99b7cb4d562491"} Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.338816 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt" Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.342562 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v" event={"ID":"be3f98e4-03d2-46bb-b7fe-bc050255934c","Type":"ContainerStarted","Data":"873e8a7573a11bbd52631a0bdccd0530ce2fa51c17001f39e1ce58181ccaf613"} Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.344313 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v" Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.346944 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf" event={"ID":"b4751629-75d2-4c2a-afb5-a7b7915cb644","Type":"ContainerStarted","Data":"51700c6a8ae2ccfb698aac579b0d38792976b850e9ed6c5b7597680fff3936d9"} Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.347860 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf" Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.349842 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp" event={"ID":"e317dfbb-55c5-49ee-8e16-5bc0532e2dfb","Type":"ContainerStarted","Data":"595b76eff21df4b71bd43ac006202696dfde575d27cb520e795eed97030f73df"} Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.350048 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp" Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.350203 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm" podStartSLOduration=2.879633791 podStartE2EDuration="23.350179411s" podCreationTimestamp="2025-10-04 02:54:06 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.486267082 +0000 UTC m=+827.383225720" lastFinishedPulling="2025-10-04 02:54:27.956812672 +0000 UTC m=+847.853771340" observedRunningTime="2025-10-04 02:54:29.339400927 +0000 UTC m=+849.236359565" watchObservedRunningTime="2025-10-04 02:54:29.350179411 +0000 UTC m=+849.247138049" Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.363733 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8" podStartSLOduration=3.575291829 podStartE2EDuration="24.363709378s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.215859492 +0000 UTC m=+827.112818130" lastFinishedPulling="2025-10-04 02:54:28.004277041 +0000 UTC m=+847.901235679" observedRunningTime="2025-10-04 02:54:29.358034588 +0000 UTC m=+849.254993226" watchObservedRunningTime="2025-10-04 02:54:29.363709378 +0000 UTC m=+849.260668016" Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.376816 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" podStartSLOduration=3.8717457939999997 podStartE2EDuration="24.376795112s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.486017706 +0000 UTC m=+827.382976334" lastFinishedPulling="2025-10-04 02:54:27.991067014 +0000 UTC m=+847.888025652" observedRunningTime="2025-10-04 02:54:29.371087181 +0000 UTC m=+849.268045839" watchObservedRunningTime="2025-10-04 02:54:29.376795112 +0000 UTC m=+849.273753760" Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.390684 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt" podStartSLOduration=3.646957785 podStartE2EDuration="24.390667567s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.213582912 +0000 UTC m=+827.110541550" lastFinishedPulling="2025-10-04 02:54:27.957292694 +0000 UTC m=+847.854251332" observedRunningTime="2025-10-04 02:54:29.387770651 +0000 UTC m=+849.284729299" watchObservedRunningTime="2025-10-04 02:54:29.390667567 +0000 UTC m=+849.287626205" Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.428232 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9" podStartSLOduration=3.685352007 podStartE2EDuration="24.428208336s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.213808918 +0000 UTC m=+827.110767556" lastFinishedPulling="2025-10-04 02:54:27.956665247 +0000 UTC m=+847.853623885" observedRunningTime="2025-10-04 02:54:29.41810898 +0000 UTC m=+849.315067608" watchObservedRunningTime="2025-10-04 02:54:29.428208336 +0000 UTC m=+849.325166994" Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.444327 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp" podStartSLOduration=3.974930522 podStartE2EDuration="24.444306s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.487151556 +0000 UTC m=+827.384110194" lastFinishedPulling="2025-10-04 02:54:27.956527024 +0000 UTC m=+847.853485672" observedRunningTime="2025-10-04 02:54:29.441966958 +0000 UTC m=+849.338925596" watchObservedRunningTime="2025-10-04 02:54:29.444306 +0000 UTC m=+849.341264648" Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.472044 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf" podStartSLOduration=3.961092287 podStartE2EDuration="24.472020999s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.498937346 +0000 UTC m=+827.395895974" lastFinishedPulling="2025-10-04 02:54:28.009866058 +0000 UTC m=+847.906824686" observedRunningTime="2025-10-04 02:54:29.466408811 +0000 UTC m=+849.363367449" watchObservedRunningTime="2025-10-04 02:54:29.472020999 +0000 UTC m=+849.368979647" Oct 04 02:54:29 crc kubenswrapper[4964]: I1004 02:54:29.491348 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v" podStartSLOduration=3.966589311 podStartE2EDuration="24.491322077s" podCreationTimestamp="2025-10-04 02:54:05 +0000 UTC" firstStartedPulling="2025-10-04 02:54:07.485843351 +0000 UTC m=+827.382801989" lastFinishedPulling="2025-10-04 02:54:28.010576117 +0000 UTC m=+847.907534755" observedRunningTime="2025-10-04 02:54:29.489289915 +0000 UTC m=+849.386248573" watchObservedRunningTime="2025-10-04 02:54:29.491322077 +0000 UTC m=+849.388280725" Oct 04 02:54:30 crc kubenswrapper[4964]: I1004 02:54:30.364153 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5xcw" event={"ID":"f1efba6a-676d-4b85-9ca5-2e6b3eae4978","Type":"ContainerStarted","Data":"c484799c88fe5b04b1aace2c7a3e823834f20d25ad2d7e6dd3d7813759ebad6d"} Oct 04 02:54:30 crc kubenswrapper[4964]: I1004 02:54:30.399159 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f5xcw" podStartSLOduration=5.151871962 podStartE2EDuration="10.399123961s" podCreationTimestamp="2025-10-04 02:54:20 +0000 UTC" firstStartedPulling="2025-10-04 02:54:24.475477002 +0000 UTC m=+844.372435640" lastFinishedPulling="2025-10-04 02:54:29.722728971 +0000 UTC m=+849.619687639" observedRunningTime="2025-10-04 02:54:30.389371825 +0000 UTC m=+850.286330473" watchObservedRunningTime="2025-10-04 02:54:30.399123961 +0000 UTC m=+850.296082639" Oct 04 02:54:30 crc kubenswrapper[4964]: I1004 02:54:30.422328 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:30 crc kubenswrapper[4964]: I1004 02:54:30.422385 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:31 crc kubenswrapper[4964]: I1004 02:54:31.503561 4964 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-f5xcw" podUID="f1efba6a-676d-4b85-9ca5-2e6b3eae4978" containerName="registry-server" probeResult="failure" output=< Oct 04 02:54:31 crc kubenswrapper[4964]: timeout: failed to connect service ":50051" within 1s Oct 04 02:54:31 crc kubenswrapper[4964]: > Oct 04 02:54:36 crc kubenswrapper[4964]: I1004 02:54:36.059535 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-sz72v" Oct 04 02:54:36 crc kubenswrapper[4964]: I1004 02:54:36.148550 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nkxrt" Oct 04 02:54:36 crc kubenswrapper[4964]: I1004 02:54:36.221207 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-qgrnf" Oct 04 02:54:36 crc kubenswrapper[4964]: I1004 02:54:36.273588 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dzqh8" Oct 04 02:54:36 crc kubenswrapper[4964]: I1004 02:54:36.310688 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vrfm9" Oct 04 02:54:36 crc kubenswrapper[4964]: I1004 02:54:36.462312 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n5sks" Oct 04 02:54:36 crc kubenswrapper[4964]: I1004 02:54:36.473350 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-frmfp" Oct 04 02:54:40 crc kubenswrapper[4964]: I1004 02:54:40.487996 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:40 
crc kubenswrapper[4964]: I1004 02:54:40.556004 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:40 crc kubenswrapper[4964]: I1004 02:54:40.720291 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f5xcw"] Oct 04 02:54:42 crc kubenswrapper[4964]: I1004 02:54:42.469349 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f5xcw" podUID="f1efba6a-676d-4b85-9ca5-2e6b3eae4978" containerName="registry-server" containerID="cri-o://c484799c88fe5b04b1aace2c7a3e823834f20d25ad2d7e6dd3d7813759ebad6d" gracePeriod=2 Oct 04 02:54:43 crc kubenswrapper[4964]: I1004 02:54:43.481294 4964 generic.go:334] "Generic (PLEG): container finished" podID="f1efba6a-676d-4b85-9ca5-2e6b3eae4978" containerID="c484799c88fe5b04b1aace2c7a3e823834f20d25ad2d7e6dd3d7813759ebad6d" exitCode=0 Oct 04 02:54:43 crc kubenswrapper[4964]: I1004 02:54:43.481389 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5xcw" event={"ID":"f1efba6a-676d-4b85-9ca5-2e6b3eae4978","Type":"ContainerDied","Data":"c484799c88fe5b04b1aace2c7a3e823834f20d25ad2d7e6dd3d7813759ebad6d"} Oct 04 02:54:43 crc kubenswrapper[4964]: I1004 02:54:43.957747 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.012114 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-utilities\") pod \"f1efba6a-676d-4b85-9ca5-2e6b3eae4978\" (UID: \"f1efba6a-676d-4b85-9ca5-2e6b3eae4978\") " Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.012227 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-catalog-content\") pod \"f1efba6a-676d-4b85-9ca5-2e6b3eae4978\" (UID: \"f1efba6a-676d-4b85-9ca5-2e6b3eae4978\") " Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.012379 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqgl8\" (UniqueName: \"kubernetes.io/projected/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-kube-api-access-kqgl8\") pod \"f1efba6a-676d-4b85-9ca5-2e6b3eae4978\" (UID: \"f1efba6a-676d-4b85-9ca5-2e6b3eae4978\") " Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.013291 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-utilities" (OuterVolumeSpecName: "utilities") pod "f1efba6a-676d-4b85-9ca5-2e6b3eae4978" (UID: "f1efba6a-676d-4b85-9ca5-2e6b3eae4978"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.033920 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-kube-api-access-kqgl8" (OuterVolumeSpecName: "kube-api-access-kqgl8") pod "f1efba6a-676d-4b85-9ca5-2e6b3eae4978" (UID: "f1efba6a-676d-4b85-9ca5-2e6b3eae4978"). InnerVolumeSpecName "kube-api-access-kqgl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.091385 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1efba6a-676d-4b85-9ca5-2e6b3eae4978" (UID: "f1efba6a-676d-4b85-9ca5-2e6b3eae4978"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.113667 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqgl8\" (UniqueName: \"kubernetes.io/projected/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-kube-api-access-kqgl8\") on node \"crc\" DevicePath \"\"" Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.113706 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.113720 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1efba6a-676d-4b85-9ca5-2e6b3eae4978-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.493677 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5xcw" event={"ID":"f1efba6a-676d-4b85-9ca5-2e6b3eae4978","Type":"ContainerDied","Data":"6d90cd7c94e9ece8b6b414bf2f156371c02b871db21d6b9dc4c833422f2acacd"} Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.493747 4964 scope.go:117] "RemoveContainer" containerID="c484799c88fe5b04b1aace2c7a3e823834f20d25ad2d7e6dd3d7813759ebad6d" Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.493852 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f5xcw" Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.511120 4964 scope.go:117] "RemoveContainer" containerID="25ba730abaee39813fa2e0b17c5fac1d749159e04313e3a8df5cd1ee28da3db9" Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.536790 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f5xcw"] Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.540891 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f5xcw"] Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.545327 4964 scope.go:117] "RemoveContainer" containerID="1d18c07d77a8c7fbebcf5b4da0bd23817f40f68fa43373700bdbb248b8cf05b5" Oct 04 02:54:44 crc kubenswrapper[4964]: I1004 02:54:44.856151 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1efba6a-676d-4b85-9ca5-2e6b3eae4978" path="/var/lib/kubelet/pods/f1efba6a-676d-4b85-9ca5-2e6b3eae4978/volumes" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.017724 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xtlz2"] Oct 04 02:54:53 crc kubenswrapper[4964]: E1004 02:54:53.018469 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1efba6a-676d-4b85-9ca5-2e6b3eae4978" containerName="extract-utilities" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.018483 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1efba6a-676d-4b85-9ca5-2e6b3eae4978" containerName="extract-utilities" Oct 04 02:54:53 crc kubenswrapper[4964]: E1004 02:54:53.018517 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1efba6a-676d-4b85-9ca5-2e6b3eae4978" containerName="extract-content" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.018525 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1efba6a-676d-4b85-9ca5-2e6b3eae4978" containerName="extract-content" Oct 04 02:54:53 
crc kubenswrapper[4964]: E1004 02:54:53.018645 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1efba6a-676d-4b85-9ca5-2e6b3eae4978" containerName="registry-server" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.018656 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1efba6a-676d-4b85-9ca5-2e6b3eae4978" containerName="registry-server" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.018824 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1efba6a-676d-4b85-9ca5-2e6b3eae4978" containerName="registry-server" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.020188 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-xtlz2" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.027151 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.027410 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-mr477" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.027514 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.027686 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.035960 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xtlz2"] Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.102048 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4cv2d"] Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.106222 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.115069 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.124178 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4cv2d"] Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.142171 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52b3fe32-4ad9-4462-9aa5-a76c41e7ddee-config\") pod \"dnsmasq-dns-675f4bcbfc-xtlz2\" (UID: \"52b3fe32-4ad9-4462-9aa5-a76c41e7ddee\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xtlz2" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.142227 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng6bs\" (UniqueName: \"kubernetes.io/projected/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-kube-api-access-ng6bs\") pod \"dnsmasq-dns-78dd6ddcc-4cv2d\" (UID: \"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.142249 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6s8m\" (UniqueName: \"kubernetes.io/projected/52b3fe32-4ad9-4462-9aa5-a76c41e7ddee-kube-api-access-s6s8m\") pod \"dnsmasq-dns-675f4bcbfc-xtlz2\" (UID: \"52b3fe32-4ad9-4462-9aa5-a76c41e7ddee\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xtlz2" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.142304 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-config\") pod \"dnsmasq-dns-78dd6ddcc-4cv2d\" (UID: \"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.142328 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4cv2d\" (UID: \"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.243273 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-config\") pod \"dnsmasq-dns-78dd6ddcc-4cv2d\" (UID: \"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.243322 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4cv2d\" (UID: \"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.243383 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52b3fe32-4ad9-4462-9aa5-a76c41e7ddee-config\") pod \"dnsmasq-dns-675f4bcbfc-xtlz2\" (UID: \"52b3fe32-4ad9-4462-9aa5-a76c41e7ddee\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xtlz2" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.243409 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng6bs\" (UniqueName: \"kubernetes.io/projected/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-kube-api-access-ng6bs\") pod \"dnsmasq-dns-78dd6ddcc-4cv2d\" (UID: \"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" Oct 04 02:54:53 crc kubenswrapper[4964]: 
I1004 02:54:53.243425 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6s8m\" (UniqueName: \"kubernetes.io/projected/52b3fe32-4ad9-4462-9aa5-a76c41e7ddee-kube-api-access-s6s8m\") pod \"dnsmasq-dns-675f4bcbfc-xtlz2\" (UID: \"52b3fe32-4ad9-4462-9aa5-a76c41e7ddee\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xtlz2" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.244178 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-config\") pod \"dnsmasq-dns-78dd6ddcc-4cv2d\" (UID: \"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.244512 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4cv2d\" (UID: \"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.244551 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52b3fe32-4ad9-4462-9aa5-a76c41e7ddee-config\") pod \"dnsmasq-dns-675f4bcbfc-xtlz2\" (UID: \"52b3fe32-4ad9-4462-9aa5-a76c41e7ddee\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xtlz2" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.273315 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6s8m\" (UniqueName: \"kubernetes.io/projected/52b3fe32-4ad9-4462-9aa5-a76c41e7ddee-kube-api-access-s6s8m\") pod \"dnsmasq-dns-675f4bcbfc-xtlz2\" (UID: \"52b3fe32-4ad9-4462-9aa5-a76c41e7ddee\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xtlz2" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.273812 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ng6bs\" (UniqueName: \"kubernetes.io/projected/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-kube-api-access-ng6bs\") pod \"dnsmasq-dns-78dd6ddcc-4cv2d\" (UID: \"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.341031 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-xtlz2" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.430513 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.753663 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xtlz2"] Oct 04 02:54:53 crc kubenswrapper[4964]: W1004 02:54:53.761282 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52b3fe32_4ad9_4462_9aa5_a76c41e7ddee.slice/crio-af55b992df502c5c005fdf4c00abde1a902c711614797978596b1f3dc5465914 WatchSource:0}: Error finding container af55b992df502c5c005fdf4c00abde1a902c711614797978596b1f3dc5465914: Status 404 returned error can't find the container with id af55b992df502c5c005fdf4c00abde1a902c711614797978596b1f3dc5465914 Oct 04 02:54:53 crc kubenswrapper[4964]: I1004 02:54:53.902027 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4cv2d"] Oct 04 02:54:53 crc kubenswrapper[4964]: W1004 02:54:53.907696 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73dd89de_d81e_4dd0_afc7_f5d26dbeb2c4.slice/crio-1b31f25b0d4be150358074a339d1f19156a7054d771975b03bc73efc564d7245 WatchSource:0}: Error finding container 1b31f25b0d4be150358074a339d1f19156a7054d771975b03bc73efc564d7245: Status 404 returned error can't find the container with id 
1b31f25b0d4be150358074a339d1f19156a7054d771975b03bc73efc564d7245 Oct 04 02:54:54 crc kubenswrapper[4964]: I1004 02:54:54.575305 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" event={"ID":"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4","Type":"ContainerStarted","Data":"1b31f25b0d4be150358074a339d1f19156a7054d771975b03bc73efc564d7245"} Oct 04 02:54:54 crc kubenswrapper[4964]: I1004 02:54:54.576839 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-xtlz2" event={"ID":"52b3fe32-4ad9-4462-9aa5-a76c41e7ddee","Type":"ContainerStarted","Data":"af55b992df502c5c005fdf4c00abde1a902c711614797978596b1f3dc5465914"} Oct 04 02:54:55 crc kubenswrapper[4964]: I1004 02:54:55.989940 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xtlz2"] Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.011036 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wsscc"] Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.012164 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-wsscc" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.023893 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wsscc"] Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.082687 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b04f912-7d03-4866-b512-5b6fb5e4371f-config\") pod \"dnsmasq-dns-666b6646f7-wsscc\" (UID: \"9b04f912-7d03-4866-b512-5b6fb5e4371f\") " pod="openstack/dnsmasq-dns-666b6646f7-wsscc" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.082735 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b04f912-7d03-4866-b512-5b6fb5e4371f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-wsscc\" (UID: \"9b04f912-7d03-4866-b512-5b6fb5e4371f\") " pod="openstack/dnsmasq-dns-666b6646f7-wsscc" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.082753 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bgpx\" (UniqueName: \"kubernetes.io/projected/9b04f912-7d03-4866-b512-5b6fb5e4371f-kube-api-access-5bgpx\") pod \"dnsmasq-dns-666b6646f7-wsscc\" (UID: \"9b04f912-7d03-4866-b512-5b6fb5e4371f\") " pod="openstack/dnsmasq-dns-666b6646f7-wsscc" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.183644 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b04f912-7d03-4866-b512-5b6fb5e4371f-config\") pod \"dnsmasq-dns-666b6646f7-wsscc\" (UID: \"9b04f912-7d03-4866-b512-5b6fb5e4371f\") " pod="openstack/dnsmasq-dns-666b6646f7-wsscc" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.183705 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bgpx\" (UniqueName: 
\"kubernetes.io/projected/9b04f912-7d03-4866-b512-5b6fb5e4371f-kube-api-access-5bgpx\") pod \"dnsmasq-dns-666b6646f7-wsscc\" (UID: \"9b04f912-7d03-4866-b512-5b6fb5e4371f\") " pod="openstack/dnsmasq-dns-666b6646f7-wsscc" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.183724 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b04f912-7d03-4866-b512-5b6fb5e4371f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-wsscc\" (UID: \"9b04f912-7d03-4866-b512-5b6fb5e4371f\") " pod="openstack/dnsmasq-dns-666b6646f7-wsscc" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.184655 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b04f912-7d03-4866-b512-5b6fb5e4371f-config\") pod \"dnsmasq-dns-666b6646f7-wsscc\" (UID: \"9b04f912-7d03-4866-b512-5b6fb5e4371f\") " pod="openstack/dnsmasq-dns-666b6646f7-wsscc" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.189138 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b04f912-7d03-4866-b512-5b6fb5e4371f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-wsscc\" (UID: \"9b04f912-7d03-4866-b512-5b6fb5e4371f\") " pod="openstack/dnsmasq-dns-666b6646f7-wsscc" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.208370 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bgpx\" (UniqueName: \"kubernetes.io/projected/9b04f912-7d03-4866-b512-5b6fb5e4371f-kube-api-access-5bgpx\") pod \"dnsmasq-dns-666b6646f7-wsscc\" (UID: \"9b04f912-7d03-4866-b512-5b6fb5e4371f\") " pod="openstack/dnsmasq-dns-666b6646f7-wsscc" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.324760 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4cv2d"] Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.334363 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-wsscc" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.355906 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wrd88"] Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.357210 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.374392 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wrd88"] Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.390903 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f20df9f7-d92f-4c37-a96f-2afcdc14307c-config\") pod \"dnsmasq-dns-57d769cc4f-wrd88\" (UID: \"f20df9f7-d92f-4c37-a96f-2afcdc14307c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.391153 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl58p\" (UniqueName: \"kubernetes.io/projected/f20df9f7-d92f-4c37-a96f-2afcdc14307c-kube-api-access-pl58p\") pod \"dnsmasq-dns-57d769cc4f-wrd88\" (UID: \"f20df9f7-d92f-4c37-a96f-2afcdc14307c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.391178 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f20df9f7-d92f-4c37-a96f-2afcdc14307c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-wrd88\" (UID: \"f20df9f7-d92f-4c37-a96f-2afcdc14307c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.492120 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f20df9f7-d92f-4c37-a96f-2afcdc14307c-config\") pod \"dnsmasq-dns-57d769cc4f-wrd88\" (UID: \"f20df9f7-d92f-4c37-a96f-2afcdc14307c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.492202 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl58p\" (UniqueName: \"kubernetes.io/projected/f20df9f7-d92f-4c37-a96f-2afcdc14307c-kube-api-access-pl58p\") pod \"dnsmasq-dns-57d769cc4f-wrd88\" (UID: \"f20df9f7-d92f-4c37-a96f-2afcdc14307c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.492225 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f20df9f7-d92f-4c37-a96f-2afcdc14307c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-wrd88\" (UID: \"f20df9f7-d92f-4c37-a96f-2afcdc14307c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.492961 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f20df9f7-d92f-4c37-a96f-2afcdc14307c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-wrd88\" (UID: \"f20df9f7-d92f-4c37-a96f-2afcdc14307c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.493428 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f20df9f7-d92f-4c37-a96f-2afcdc14307c-config\") pod \"dnsmasq-dns-57d769cc4f-wrd88\" (UID: \"f20df9f7-d92f-4c37-a96f-2afcdc14307c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.531658 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl58p\" (UniqueName: \"kubernetes.io/projected/f20df9f7-d92f-4c37-a96f-2afcdc14307c-kube-api-access-pl58p\") pod 
\"dnsmasq-dns-57d769cc4f-wrd88\" (UID: \"f20df9f7-d92f-4c37-a96f-2afcdc14307c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" Oct 04 02:54:56 crc kubenswrapper[4964]: I1004 02:54:56.671513 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.162975 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.164157 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.166643 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-k58qh" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.166916 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.167114 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.172360 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.173397 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.173764 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.173935 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.174095 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 04 02:54:57 crc 
kubenswrapper[4964]: I1004 02:54:57.204060 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.204133 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e87e52ad-66be-448b-b575-6d0acd8a8d4e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.204188 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e87e52ad-66be-448b-b575-6d0acd8a8d4e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.204220 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.204240 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-config-data\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.204907 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwrrf\" (UniqueName: \"kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-kube-api-access-bwrrf\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.204990 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.205128 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.205305 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.205367 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.205393 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.307020 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.307064 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.307087 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.307118 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.307133 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e87e52ad-66be-448b-b575-6d0acd8a8d4e-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.307151 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e87e52ad-66be-448b-b575-6d0acd8a8d4e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.307321 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.307338 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-config-data\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.307356 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwrrf\" (UniqueName: \"kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-kube-api-access-bwrrf\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.307378 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.307454 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.307677 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.307808 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.308134 4964 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.309431 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-config-data\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.310905 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.311449 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.314603 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.315372 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e87e52ad-66be-448b-b575-6d0acd8a8d4e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.321387 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.322424 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e87e52ad-66be-448b-b575-6d0acd8a8d4e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc 
kubenswrapper[4964]: I1004 02:54:57.323924 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwrrf\" (UniqueName: \"kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-kube-api-access-bwrrf\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.329637 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.460359 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.462197 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.466436 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.466505 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.466605 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.466734 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.466760 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.470033 4964 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.472058 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-s5vz9" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.475390 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.525245 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.612806 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.612869 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.612918 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.613073 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.613355 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.613445 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.613534 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.613606 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.613712 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-766kx\" (UniqueName: 
\"kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-kube-api-access-766kx\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.613752 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58ea849f-c48c-473c-8608-694d254c47cf-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.613802 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/58ea849f-c48c-473c-8608-694d254c47cf-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.715248 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.715314 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.715381 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.715413 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.715471 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.715532 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.715574 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.715659 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-766kx\" (UniqueName: \"kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-kube-api-access-766kx\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.715701 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58ea849f-c48c-473c-8608-694d254c47cf-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.715740 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/58ea849f-c48c-473c-8608-694d254c47cf-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.715821 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.716550 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.717786 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" 
Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.719561 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.721282 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.721435 4964 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.728640 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.733496 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/58ea849f-c48c-473c-8608-694d254c47cf-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.734488 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.735256 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.739993 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58ea849f-c48c-473c-8608-694d254c47cf-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.748048 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-766kx\" (UniqueName: \"kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-kube-api-access-766kx\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.760786 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:54:57 crc kubenswrapper[4964]: I1004 02:54:57.806055 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.032716 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.036715 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.039313 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.049719 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.056648 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-4g4z5" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.056858 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.058499 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.060827 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.066429 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.149212 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/79d85912-9774-4b24-bacc-13feb8d11ca4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 
02:55:00.149256 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/79d85912-9774-4b24-bacc-13feb8d11ca4-config-data-default\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.149323 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/79d85912-9774-4b24-bacc-13feb8d11ca4-secrets\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.149359 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/79d85912-9774-4b24-bacc-13feb8d11ca4-kolla-config\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.149380 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/79d85912-9774-4b24-bacc-13feb8d11ca4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.149420 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.149439 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79d85912-9774-4b24-bacc-13feb8d11ca4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.149464 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5754\" (UniqueName: \"kubernetes.io/projected/79d85912-9774-4b24-bacc-13feb8d11ca4-kube-api-access-f5754\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.149492 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79d85912-9774-4b24-bacc-13feb8d11ca4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.151101 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.152318 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.157807 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.157817 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-qccp7" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.158807 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.166120 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.173186 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.251246 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.251309 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/79d85912-9774-4b24-bacc-13feb8d11ca4-secrets\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.251349 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/79d85912-9774-4b24-bacc-13feb8d11ca4-kolla-config\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " 
pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.251375 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mczh2\" (UniqueName: \"kubernetes.io/projected/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-kube-api-access-mczh2\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.251403 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/79d85912-9774-4b24-bacc-13feb8d11ca4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.251425 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.251465 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.251497 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 
02:55:00.251518 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79d85912-9774-4b24-bacc-13feb8d11ca4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.251544 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.252470 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5754\" (UniqueName: \"kubernetes.io/projected/79d85912-9774-4b24-bacc-13feb8d11ca4-kube-api-access-f5754\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.252576 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.252682 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79d85912-9774-4b24-bacc-13feb8d11ca4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.252770 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.252914 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.252960 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/79d85912-9774-4b24-bacc-13feb8d11ca4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.253034 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/79d85912-9774-4b24-bacc-13feb8d11ca4-config-data-default\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.253100 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/79d85912-9774-4b24-bacc-13feb8d11ca4-kolla-config\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.253119 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.257163 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79d85912-9774-4b24-bacc-13feb8d11ca4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.257940 4964 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.258009 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79d85912-9774-4b24-bacc-13feb8d11ca4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.258174 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/79d85912-9774-4b24-bacc-13feb8d11ca4-secrets\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.258765 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/79d85912-9774-4b24-bacc-13feb8d11ca4-config-data-generated\") pod \"openstack-galera-0\" 
(UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.258849 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/79d85912-9774-4b24-bacc-13feb8d11ca4-config-data-default\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.271227 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/79d85912-9774-4b24-bacc-13feb8d11ca4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.283332 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5754\" (UniqueName: \"kubernetes.io/projected/79d85912-9774-4b24-bacc-13feb8d11ca4-kube-api-access-f5754\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.286412 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"79d85912-9774-4b24-bacc-13feb8d11ca4\") " pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.354031 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mczh2\" (UniqueName: \"kubernetes.io/projected/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-kube-api-access-mczh2\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.354065 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.354099 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.354124 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.354145 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.354165 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.354191 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.354211 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.354244 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.354370 4964 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.360566 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.361092 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.361753 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.363307 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.364948 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.375078 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 
02:55:00.377762 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.378600 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.383020 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.383784 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mczh2\" (UniqueName: \"kubernetes.io/projected/0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6-kube-api-access-mczh2\") pod \"openstack-cell1-galera-0\" (UID: \"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6\") " pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.467460 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.773789 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.775362 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.780479 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-szhff" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.780680 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.780865 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.781408 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.862150 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a557673-3117-4fd4-8f34-28e1b7541c9c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6a557673-3117-4fd4-8f34-28e1b7541c9c\") " pod="openstack/memcached-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.862227 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6a557673-3117-4fd4-8f34-28e1b7541c9c-kolla-config\") pod \"memcached-0\" (UID: \"6a557673-3117-4fd4-8f34-28e1b7541c9c\") " pod="openstack/memcached-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.862297 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a557673-3117-4fd4-8f34-28e1b7541c9c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6a557673-3117-4fd4-8f34-28e1b7541c9c\") " pod="openstack/memcached-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.862388 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-drq2q\" (UniqueName: \"kubernetes.io/projected/6a557673-3117-4fd4-8f34-28e1b7541c9c-kube-api-access-drq2q\") pod \"memcached-0\" (UID: \"6a557673-3117-4fd4-8f34-28e1b7541c9c\") " pod="openstack/memcached-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.862418 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a557673-3117-4fd4-8f34-28e1b7541c9c-config-data\") pod \"memcached-0\" (UID: \"6a557673-3117-4fd4-8f34-28e1b7541c9c\") " pod="openstack/memcached-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.963287 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6a557673-3117-4fd4-8f34-28e1b7541c9c-kolla-config\") pod \"memcached-0\" (UID: \"6a557673-3117-4fd4-8f34-28e1b7541c9c\") " pod="openstack/memcached-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.963366 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a557673-3117-4fd4-8f34-28e1b7541c9c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6a557673-3117-4fd4-8f34-28e1b7541c9c\") " pod="openstack/memcached-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.963409 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drq2q\" (UniqueName: \"kubernetes.io/projected/6a557673-3117-4fd4-8f34-28e1b7541c9c-kube-api-access-drq2q\") pod \"memcached-0\" (UID: \"6a557673-3117-4fd4-8f34-28e1b7541c9c\") " pod="openstack/memcached-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.963431 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a557673-3117-4fd4-8f34-28e1b7541c9c-config-data\") pod \"memcached-0\" (UID: \"6a557673-3117-4fd4-8f34-28e1b7541c9c\") 
" pod="openstack/memcached-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.963455 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a557673-3117-4fd4-8f34-28e1b7541c9c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6a557673-3117-4fd4-8f34-28e1b7541c9c\") " pod="openstack/memcached-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.964034 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6a557673-3117-4fd4-8f34-28e1b7541c9c-kolla-config\") pod \"memcached-0\" (UID: \"6a557673-3117-4fd4-8f34-28e1b7541c9c\") " pod="openstack/memcached-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.964391 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a557673-3117-4fd4-8f34-28e1b7541c9c-config-data\") pod \"memcached-0\" (UID: \"6a557673-3117-4fd4-8f34-28e1b7541c9c\") " pod="openstack/memcached-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.967351 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a557673-3117-4fd4-8f34-28e1b7541c9c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6a557673-3117-4fd4-8f34-28e1b7541c9c\") " pod="openstack/memcached-0" Oct 04 02:55:00 crc kubenswrapper[4964]: I1004 02:55:00.968020 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a557673-3117-4fd4-8f34-28e1b7541c9c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6a557673-3117-4fd4-8f34-28e1b7541c9c\") " pod="openstack/memcached-0" Oct 04 02:55:01 crc kubenswrapper[4964]: I1004 02:55:01.000607 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drq2q\" (UniqueName: 
\"kubernetes.io/projected/6a557673-3117-4fd4-8f34-28e1b7541c9c-kube-api-access-drq2q\") pod \"memcached-0\" (UID: \"6a557673-3117-4fd4-8f34-28e1b7541c9c\") " pod="openstack/memcached-0" Oct 04 02:55:01 crc kubenswrapper[4964]: I1004 02:55:01.101868 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 04 02:55:02 crc kubenswrapper[4964]: I1004 02:55:02.471538 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 02:55:02 crc kubenswrapper[4964]: I1004 02:55:02.474212 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 04 02:55:02 crc kubenswrapper[4964]: I1004 02:55:02.477558 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-4q6fg" Oct 04 02:55:02 crc kubenswrapper[4964]: I1004 02:55:02.478328 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 02:55:02 crc kubenswrapper[4964]: I1004 02:55:02.589284 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75pht\" (UniqueName: \"kubernetes.io/projected/1b085f5a-7d46-4971-849a-3ff0f69cb179-kube-api-access-75pht\") pod \"kube-state-metrics-0\" (UID: \"1b085f5a-7d46-4971-849a-3ff0f69cb179\") " pod="openstack/kube-state-metrics-0" Oct 04 02:55:02 crc kubenswrapper[4964]: I1004 02:55:02.690480 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75pht\" (UniqueName: \"kubernetes.io/projected/1b085f5a-7d46-4971-849a-3ff0f69cb179-kube-api-access-75pht\") pod \"kube-state-metrics-0\" (UID: \"1b085f5a-7d46-4971-849a-3ff0f69cb179\") " pod="openstack/kube-state-metrics-0" Oct 04 02:55:02 crc kubenswrapper[4964]: I1004 02:55:02.709045 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75pht\" (UniqueName: 
\"kubernetes.io/projected/1b085f5a-7d46-4971-849a-3ff0f69cb179-kube-api-access-75pht\") pod \"kube-state-metrics-0\" (UID: \"1b085f5a-7d46-4971-849a-3ff0f69cb179\") " pod="openstack/kube-state-metrics-0" Oct 04 02:55:02 crc kubenswrapper[4964]: I1004 02:55:02.790331 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.310313 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.311825 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.314359 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.328277 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.328318 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.328541 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-sgpmp" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.329496 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.340274 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.463087 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71c6ccd3-a834-4a46-a25d-c92b7653c846-scripts\") pod 
\"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.463129 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c6ccd3-a834-4a46-a25d-c92b7653c846-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.463151 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c6ccd3-a834-4a46-a25d-c92b7653c846-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.463179 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c6ccd3-a834-4a46-a25d-c92b7653c846-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.463285 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/71c6ccd3-a834-4a46-a25d-c92b7653c846-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.463377 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71c6ccd3-a834-4a46-a25d-c92b7653c846-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.463405 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.463430 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5mj2\" (UniqueName: \"kubernetes.io/projected/71c6ccd3-a834-4a46-a25d-c92b7653c846-kube-api-access-l5mj2\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.564989 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71c6ccd3-a834-4a46-a25d-c92b7653c846-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.565035 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c6ccd3-a834-4a46-a25d-c92b7653c846-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.565054 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c6ccd3-a834-4a46-a25d-c92b7653c846-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 
02:55:07.565075 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c6ccd3-a834-4a46-a25d-c92b7653c846-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.565096 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/71c6ccd3-a834-4a46-a25d-c92b7653c846-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.565125 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71c6ccd3-a834-4a46-a25d-c92b7653c846-config\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.565149 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.565170 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5mj2\" (UniqueName: \"kubernetes.io/projected/71c6ccd3-a834-4a46-a25d-c92b7653c846-kube-api-access-l5mj2\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.566351 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71c6ccd3-a834-4a46-a25d-c92b7653c846-scripts\") pod 
\"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.567129 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/71c6ccd3-a834-4a46-a25d-c92b7653c846-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.568394 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71c6ccd3-a834-4a46-a25d-c92b7653c846-config\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.568661 4964 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.570915 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c6ccd3-a834-4a46-a25d-c92b7653c846-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.571239 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c6ccd3-a834-4a46-a25d-c92b7653c846-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.573120 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c6ccd3-a834-4a46-a25d-c92b7653c846-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.581014 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5mj2\" (UniqueName: \"kubernetes.io/projected/71c6ccd3-a834-4a46-a25d-c92b7653c846-kube-api-access-l5mj2\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.605413 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"71c6ccd3-a834-4a46-a25d-c92b7653c846\") " pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.631236 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.671449 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-c6kng"] Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.672538 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.676572 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.677000 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.678593 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-fj9vk"] Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.680472 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.681239 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-6b8gb" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.682659 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c6kng"] Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.703592 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fj9vk"] Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.768590 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-var-log\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.768692 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b37ef67d-6614-4b44-9435-a35a4939caf7-var-log-ovn\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " 
pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.768717 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37ef67d-6614-4b44-9435-a35a4939caf7-combined-ca-bundle\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.768741 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b37ef67d-6614-4b44-9435-a35a4939caf7-scripts\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.768765 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrj97\" (UniqueName: \"kubernetes.io/projected/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-kube-api-access-rrj97\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.768787 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b37ef67d-6614-4b44-9435-a35a4939caf7-var-run-ovn\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.768811 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b37ef67d-6614-4b44-9435-a35a4939caf7-ovn-controller-tls-certs\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " 
pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.768867 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-var-lib\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.768891 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-etc-ovs\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.768925 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-var-run\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.768949 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c76m\" (UniqueName: \"kubernetes.io/projected/b37ef67d-6614-4b44-9435-a35a4939caf7-kube-api-access-6c76m\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.768976 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b37ef67d-6614-4b44-9435-a35a4939caf7-var-run\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc 
kubenswrapper[4964]: I1004 02:55:07.768990 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-scripts\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.870421 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-var-log\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.870525 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b37ef67d-6614-4b44-9435-a35a4939caf7-var-log-ovn\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.870567 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37ef67d-6614-4b44-9435-a35a4939caf7-combined-ca-bundle\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.870607 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrj97\" (UniqueName: \"kubernetes.io/projected/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-kube-api-access-rrj97\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.870661 4964 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b37ef67d-6614-4b44-9435-a35a4939caf7-scripts\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.870701 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b37ef67d-6614-4b44-9435-a35a4939caf7-var-run-ovn\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.870773 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b37ef67d-6614-4b44-9435-a35a4939caf7-ovn-controller-tls-certs\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.871292 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-var-lib\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.871342 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-etc-ovs\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.871397 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-var-run\") pod 
\"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.871435 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c76m\" (UniqueName: \"kubernetes.io/projected/b37ef67d-6614-4b44-9435-a35a4939caf7-kube-api-access-6c76m\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.871497 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b37ef67d-6614-4b44-9435-a35a4939caf7-var-run\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.871517 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-var-lib\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.871529 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-scripts\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.871095 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-var-log\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 
02:55:07.871631 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-var-run\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.871221 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b37ef67d-6614-4b44-9435-a35a4939caf7-var-run-ovn\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.871803 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-etc-ovs\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.871842 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b37ef67d-6614-4b44-9435-a35a4939caf7-var-run\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.872018 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b37ef67d-6614-4b44-9435-a35a4939caf7-var-log-ovn\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.874033 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-scripts\") pod 
\"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.874369 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b37ef67d-6614-4b44-9435-a35a4939caf7-scripts\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.889166 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b37ef67d-6614-4b44-9435-a35a4939caf7-ovn-controller-tls-certs\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.890164 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37ef67d-6614-4b44-9435-a35a4939caf7-combined-ca-bundle\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.897062 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c76m\" (UniqueName: \"kubernetes.io/projected/b37ef67d-6614-4b44-9435-a35a4939caf7-kube-api-access-6c76m\") pod \"ovn-controller-c6kng\" (UID: \"b37ef67d-6614-4b44-9435-a35a4939caf7\") " pod="openstack/ovn-controller-c6kng" Oct 04 02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.901327 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrj97\" (UniqueName: \"kubernetes.io/projected/c45ccbc0-f08a-43ed-b80b-620fd961cb2d-kube-api-access-rrj97\") pod \"ovn-controller-ovs-fj9vk\" (UID: \"c45ccbc0-f08a-43ed-b80b-620fd961cb2d\") " pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 
02:55:07 crc kubenswrapper[4964]: I1004 02:55:07.991314 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c6kng" Oct 04 02:55:08 crc kubenswrapper[4964]: I1004 02:55:08.005050 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:09 crc kubenswrapper[4964]: E1004 02:55:09.276935 4964 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 04 02:55:09 crc kubenswrapper[4964]: E1004 02:55:09.277362 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s6s8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-xtlz2_openstack(52b3fe32-4ad9-4462-9aa5-a76c41e7ddee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 02:55:09 crc kubenswrapper[4964]: E1004 02:55:09.279362 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-xtlz2" podUID="52b3fe32-4ad9-4462-9aa5-a76c41e7ddee" Oct 04 02:55:09 crc kubenswrapper[4964]: E1004 02:55:09.292203 4964 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 04 02:55:09 crc kubenswrapper[4964]: E1004 02:55:09.292349 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ng6bs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-4cv2d_openstack(73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 02:55:09 crc kubenswrapper[4964]: E1004 02:55:09.293744 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" podUID="73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.436179 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.437609 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.441466 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.441666 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.441797 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.446830 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-2m89v" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.472890 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.497387 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.497435 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-config\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.497459 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll8b7\" (UniqueName: \"kubernetes.io/projected/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-kube-api-access-ll8b7\") pod \"ovsdbserver-nb-0\" (UID: 
\"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.497654 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.497777 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.497818 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.497893 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.497939 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 
04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.601704 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.601768 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.601796 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.601825 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.601852 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-config\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.601873 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll8b7\" (UniqueName: 
\"kubernetes.io/projected/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-kube-api-access-ll8b7\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.601915 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.601945 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.602815 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.603803 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-config\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.604821 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc 
kubenswrapper[4964]: I1004 02:55:09.605032 4964 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.610327 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.613200 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.613732 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.633832 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll8b7\" (UniqueName: \"kubernetes.io/projected/2f1fa594-fc65-4b98-9fad-a3e2cb027eba-kube-api-access-ll8b7\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.647565 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2f1fa594-fc65-4b98-9fad-a3e2cb027eba\") " pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:09 crc kubenswrapper[4964]: I1004 02:55:09.765237 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.117947 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-xtlz2" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.126245 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.217550 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng6bs\" (UniqueName: \"kubernetes.io/projected/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-kube-api-access-ng6bs\") pod \"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4\" (UID: \"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4\") " Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.217647 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52b3fe32-4ad9-4462-9aa5-a76c41e7ddee-config\") pod \"52b3fe32-4ad9-4462-9aa5-a76c41e7ddee\" (UID: \"52b3fe32-4ad9-4462-9aa5-a76c41e7ddee\") " Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.217705 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-config\") pod \"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4\" (UID: \"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4\") " Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.217765 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6s8m\" (UniqueName: 
\"kubernetes.io/projected/52b3fe32-4ad9-4462-9aa5-a76c41e7ddee-kube-api-access-s6s8m\") pod \"52b3fe32-4ad9-4462-9aa5-a76c41e7ddee\" (UID: \"52b3fe32-4ad9-4462-9aa5-a76c41e7ddee\") " Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.217793 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-dns-svc\") pod \"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4\" (UID: \"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4\") " Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.218296 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52b3fe32-4ad9-4462-9aa5-a76c41e7ddee-config" (OuterVolumeSpecName: "config") pod "52b3fe32-4ad9-4462-9aa5-a76c41e7ddee" (UID: "52b3fe32-4ad9-4462-9aa5-a76c41e7ddee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.219189 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4" (UID: "73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.221430 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-config" (OuterVolumeSpecName: "config") pod "73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4" (UID: "73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.223725 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-kube-api-access-ng6bs" (OuterVolumeSpecName: "kube-api-access-ng6bs") pod "73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4" (UID: "73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4"). InnerVolumeSpecName "kube-api-access-ng6bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.223790 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b3fe32-4ad9-4462-9aa5-a76c41e7ddee-kube-api-access-s6s8m" (OuterVolumeSpecName: "kube-api-access-s6s8m") pod "52b3fe32-4ad9-4462-9aa5-a76c41e7ddee" (UID: "52b3fe32-4ad9-4462-9aa5-a76c41e7ddee"). InnerVolumeSpecName "kube-api-access-s6s8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.228720 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wsscc"] Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.230445 4964 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.257862 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c6kng"] Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.280845 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wrd88"] Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.297329 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.313129 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 
02:55:10.319349 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng6bs\" (UniqueName: \"kubernetes.io/projected/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-kube-api-access-ng6bs\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.319378 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52b3fe32-4ad9-4462-9aa5-a76c41e7ddee-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.319389 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.319400 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6s8m\" (UniqueName: \"kubernetes.io/projected/52b3fe32-4ad9-4462-9aa5-a76c41e7ddee-kube-api-access-s6s8m\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.319409 4964 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.475337 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.483089 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.490163 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.505094 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.571462 4964 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ovn-controller-ovs-fj9vk"] Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.690034 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 04 02:55:10 crc kubenswrapper[4964]: W1004 02:55:10.695750 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f1fa594_fc65_4b98_9fad_a3e2cb027eba.slice/crio-d91822d836439d5d2098d19e5c28520a79eacd1e65b7113909f8ad020fbb6d8e WatchSource:0}: Error finding container d91822d836439d5d2098d19e5c28520a79eacd1e65b7113909f8ad020fbb6d8e: Status 404 returned error can't find the container with id d91822d836439d5d2098d19e5c28520a79eacd1e65b7113909f8ad020fbb6d8e Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.722466 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e87e52ad-66be-448b-b575-6d0acd8a8d4e","Type":"ContainerStarted","Data":"8c7b8f127f3fcdaba27bb59295f0f09db30a21fee9c9b59fc229353188048afc"} Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.725231 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"79d85912-9774-4b24-bacc-13feb8d11ca4","Type":"ContainerStarted","Data":"ec906db8d37bec9a4dfad503e82ca6e51f4d3ab6498617013364c715302f06a4"} Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.727606 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6a557673-3117-4fd4-8f34-28e1b7541c9c","Type":"ContainerStarted","Data":"8f58e122a6287f1e049098e23641d8646f06eba4c9ac6a4738a785136409f8b4"} Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.728781 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2f1fa594-fc65-4b98-9fad-a3e2cb027eba","Type":"ContainerStarted","Data":"d91822d836439d5d2098d19e5c28520a79eacd1e65b7113909f8ad020fbb6d8e"} Oct 04 02:55:10 crc 
kubenswrapper[4964]: I1004 02:55:10.730398 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6kng" event={"ID":"b37ef67d-6614-4b44-9435-a35a4939caf7","Type":"ContainerStarted","Data":"13384add9fa7149856a3b74ed855e2c5769a641aa4fd81b2489e4db39af73f79"} Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.731787 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" event={"ID":"f20df9f7-d92f-4c37-a96f-2afcdc14307c","Type":"ContainerStarted","Data":"2046cdd8bb7f7c10338a6c97dda7a11e164549e8ed48087c1182018e0083c6f2"} Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.732992 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fj9vk" event={"ID":"c45ccbc0-f08a-43ed-b80b-620fd961cb2d","Type":"ContainerStarted","Data":"771e257e383b38ef48da3a0234760d251547cf568cbf2c72db0398f564fd414b"} Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.734173 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-xtlz2" event={"ID":"52b3fe32-4ad9-4462-9aa5-a76c41e7ddee","Type":"ContainerDied","Data":"af55b992df502c5c005fdf4c00abde1a902c711614797978596b1f3dc5465914"} Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.734240 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-xtlz2" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.735931 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"58ea849f-c48c-473c-8608-694d254c47cf","Type":"ContainerStarted","Data":"8b896c1c70823e0cf0b70ac85c19f1df0914f0fd7997c93ec60f1501a4db1da9"} Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.737153 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6","Type":"ContainerStarted","Data":"c2fdc02aa8c2a6802a15b0a9279303cfe04d57b19b3384a33795443d488d3424"} Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.738347 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1b085f5a-7d46-4971-849a-3ff0f69cb179","Type":"ContainerStarted","Data":"7fa0051ddcd32f37ca720d4880df1dcf603c77f40d6380a73cb4751d6ebdbc0e"} Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.739252 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-wsscc" event={"ID":"9b04f912-7d03-4866-b512-5b6fb5e4371f","Type":"ContainerStarted","Data":"30fe4d055846317296ef4468e6233c2b140268a3c9b2d76e7b30a2bbaf2f932d"} Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.740916 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" event={"ID":"73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4","Type":"ContainerDied","Data":"1b31f25b0d4be150358074a339d1f19156a7054d771975b03bc73efc564d7245"} Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.740959 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4cv2d" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.826782 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xtlz2"] Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.834720 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xtlz2"] Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.862689 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52b3fe32-4ad9-4462-9aa5-a76c41e7ddee" path="/var/lib/kubelet/pods/52b3fe32-4ad9-4462-9aa5-a76c41e7ddee/volumes" Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.863267 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4cv2d"] Oct 04 02:55:10 crc kubenswrapper[4964]: I1004 02:55:10.863292 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4cv2d"] Oct 04 02:55:11 crc kubenswrapper[4964]: I1004 02:55:11.103497 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 04 02:55:11 crc kubenswrapper[4964]: W1004 02:55:11.108384 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71c6ccd3_a834_4a46_a25d_c92b7653c846.slice/crio-fa7da33f6b78b02ffe16b9833965478636048701301d04d9a028469624737516 WatchSource:0}: Error finding container fa7da33f6b78b02ffe16b9833965478636048701301d04d9a028469624737516: Status 404 returned error can't find the container with id fa7da33f6b78b02ffe16b9833965478636048701301d04d9a028469624737516 Oct 04 02:55:11 crc kubenswrapper[4964]: I1004 02:55:11.759266 4964 generic.go:334] "Generic (PLEG): container finished" podID="f20df9f7-d92f-4c37-a96f-2afcdc14307c" containerID="0c98b598f99716598a23610f9c7687bc3d970191911fc235c6eb420653fbb66d" exitCode=0 Oct 04 02:55:11 crc kubenswrapper[4964]: I1004 02:55:11.760347 4964 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" event={"ID":"f20df9f7-d92f-4c37-a96f-2afcdc14307c","Type":"ContainerDied","Data":"0c98b598f99716598a23610f9c7687bc3d970191911fc235c6eb420653fbb66d"} Oct 04 02:55:11 crc kubenswrapper[4964]: I1004 02:55:11.762575 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"71c6ccd3-a834-4a46-a25d-c92b7653c846","Type":"ContainerStarted","Data":"fa7da33f6b78b02ffe16b9833965478636048701301d04d9a028469624737516"} Oct 04 02:55:11 crc kubenswrapper[4964]: I1004 02:55:11.764536 4964 generic.go:334] "Generic (PLEG): container finished" podID="9b04f912-7d03-4866-b512-5b6fb5e4371f" containerID="0f955b95d05aca988608a6ccff8b6e19fc438095b990d0dca76c753c35052c46" exitCode=0 Oct 04 02:55:11 crc kubenswrapper[4964]: I1004 02:55:11.764564 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-wsscc" event={"ID":"9b04f912-7d03-4866-b512-5b6fb5e4371f","Type":"ContainerDied","Data":"0f955b95d05aca988608a6ccff8b6e19fc438095b990d0dca76c753c35052c46"} Oct 04 02:55:12 crc kubenswrapper[4964]: I1004 02:55:12.859449 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4" path="/var/lib/kubelet/pods/73dd89de-d81e-4dd0-afc7-f5d26dbeb2c4/volumes" Oct 04 02:55:18 crc kubenswrapper[4964]: I1004 02:55:18.874487 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"71c6ccd3-a834-4a46-a25d-c92b7653c846","Type":"ContainerStarted","Data":"ece89e364161c226c3e479dae79275944bc6cebbeb20e61211e2546cdbb13c1e"} Oct 04 02:55:18 crc kubenswrapper[4964]: I1004 02:55:18.874874 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"79d85912-9774-4b24-bacc-13feb8d11ca4","Type":"ContainerStarted","Data":"c602ad2e3f3241a3b51884e7701b4a7dcb8743d74a91b133e72715b5dbb972e5"} Oct 04 02:55:18 crc 
kubenswrapper[4964]: I1004 02:55:18.880752 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-wsscc" event={"ID":"9b04f912-7d03-4866-b512-5b6fb5e4371f","Type":"ContainerStarted","Data":"2df1326293fbba502bc45403667ca150d4fb423c28c0358432a81f7563708c4e"} Oct 04 02:55:18 crc kubenswrapper[4964]: I1004 02:55:18.880790 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-wsscc" Oct 04 02:55:18 crc kubenswrapper[4964]: I1004 02:55:18.885559 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6a557673-3117-4fd4-8f34-28e1b7541c9c","Type":"ContainerStarted","Data":"12e4a2a026ee8c991ffa59e7a26393e3cdbd202e594608b90ddda4d0f29b0fbe"} Oct 04 02:55:18 crc kubenswrapper[4964]: I1004 02:55:18.885789 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 04 02:55:18 crc kubenswrapper[4964]: I1004 02:55:18.888099 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2f1fa594-fc65-4b98-9fad-a3e2cb027eba","Type":"ContainerStarted","Data":"6aa358a08eb537c5cb4f3242d05397f176579101e8c70665bca78d61521dc800"} Oct 04 02:55:18 crc kubenswrapper[4964]: I1004 02:55:18.891937 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" event={"ID":"f20df9f7-d92f-4c37-a96f-2afcdc14307c","Type":"ContainerStarted","Data":"bb0e4197a319ada83addebfb99bc534d45d1452c5a72f868412b4f4f4a2208f6"} Oct 04 02:55:18 crc kubenswrapper[4964]: I1004 02:55:18.894210 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" Oct 04 02:55:18 crc kubenswrapper[4964]: I1004 02:55:18.901224 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6","Type":"ContainerStarted","Data":"10a38cc703f69d502df3401d8707e313520a01ddf639725c3951f596b97d2cad"} Oct 04 02:55:18 crc kubenswrapper[4964]: I1004 02:55:18.915139 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.083752287 podStartE2EDuration="18.91512518s" podCreationTimestamp="2025-10-04 02:55:00 +0000 UTC" firstStartedPulling="2025-10-04 02:55:10.483003768 +0000 UTC m=+890.379962406" lastFinishedPulling="2025-10-04 02:55:18.314376651 +0000 UTC m=+898.211335299" observedRunningTime="2025-10-04 02:55:18.912123801 +0000 UTC m=+898.809082439" watchObservedRunningTime="2025-10-04 02:55:18.91512518 +0000 UTC m=+898.812083818" Oct 04 02:55:18 crc kubenswrapper[4964]: I1004 02:55:18.930069 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" podStartSLOduration=22.513157204 podStartE2EDuration="22.930052332s" podCreationTimestamp="2025-10-04 02:54:56 +0000 UTC" firstStartedPulling="2025-10-04 02:55:10.239131566 +0000 UTC m=+890.136090204" lastFinishedPulling="2025-10-04 02:55:10.656026694 +0000 UTC m=+890.552985332" observedRunningTime="2025-10-04 02:55:18.929061817 +0000 UTC m=+898.826020465" watchObservedRunningTime="2025-10-04 02:55:18.930052332 +0000 UTC m=+898.827010960" Oct 04 02:55:18 crc kubenswrapper[4964]: I1004 02:55:18.952502 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-wsscc" podStartSLOduration=23.469768092 podStartE2EDuration="23.952485223s" podCreationTimestamp="2025-10-04 02:54:55 +0000 UTC" firstStartedPulling="2025-10-04 02:55:10.23016306 +0000 UTC m=+890.127121698" lastFinishedPulling="2025-10-04 02:55:10.712880181 +0000 UTC m=+890.609838829" observedRunningTime="2025-10-04 02:55:18.948081528 +0000 UTC m=+898.845040156" watchObservedRunningTime="2025-10-04 02:55:18.952485223 +0000 UTC 
m=+898.849443861" Oct 04 02:55:19 crc kubenswrapper[4964]: I1004 02:55:19.912597 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1b085f5a-7d46-4971-849a-3ff0f69cb179","Type":"ContainerStarted","Data":"405956f03d020a78ba4d1d8314bef91b0df179ab83883eca873527079516afbc"} Oct 04 02:55:19 crc kubenswrapper[4964]: I1004 02:55:19.912939 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 04 02:55:19 crc kubenswrapper[4964]: I1004 02:55:19.931921 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6kng" event={"ID":"b37ef67d-6614-4b44-9435-a35a4939caf7","Type":"ContainerStarted","Data":"aeb64bdd684d5d39e28a570d0682275460d138c3067f4b2198f892a3dcb98ddc"} Oct 04 02:55:19 crc kubenswrapper[4964]: I1004 02:55:19.932419 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-c6kng" Oct 04 02:55:19 crc kubenswrapper[4964]: I1004 02:55:19.936235 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.010066038 podStartE2EDuration="17.936214246s" podCreationTimestamp="2025-10-04 02:55:02 +0000 UTC" firstStartedPulling="2025-10-04 02:55:10.493871364 +0000 UTC m=+890.390830012" lastFinishedPulling="2025-10-04 02:55:18.420019582 +0000 UTC m=+898.316978220" observedRunningTime="2025-10-04 02:55:19.931124873 +0000 UTC m=+899.828083541" watchObservedRunningTime="2025-10-04 02:55:19.936214246 +0000 UTC m=+899.833172884" Oct 04 02:55:19 crc kubenswrapper[4964]: I1004 02:55:19.938271 4964 generic.go:334] "Generic (PLEG): container finished" podID="c45ccbc0-f08a-43ed-b80b-620fd961cb2d" containerID="6273dbb8bdece50d4bbc4101f708a736b5a5566ea50f06ebfaebeef52d2ee8ff" exitCode=0 Oct 04 02:55:19 crc kubenswrapper[4964]: I1004 02:55:19.938384 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-fj9vk" event={"ID":"c45ccbc0-f08a-43ed-b80b-620fd961cb2d","Type":"ContainerDied","Data":"6273dbb8bdece50d4bbc4101f708a736b5a5566ea50f06ebfaebeef52d2ee8ff"} Oct 04 02:55:19 crc kubenswrapper[4964]: I1004 02:55:19.961999 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-c6kng" podStartSLOduration=6.342880853 podStartE2EDuration="12.961969105s" podCreationTimestamp="2025-10-04 02:55:07 +0000 UTC" firstStartedPulling="2025-10-04 02:55:10.234981696 +0000 UTC m=+890.131940334" lastFinishedPulling="2025-10-04 02:55:16.854069948 +0000 UTC m=+896.751028586" observedRunningTime="2025-10-04 02:55:19.95987373 +0000 UTC m=+899.856832418" watchObservedRunningTime="2025-10-04 02:55:19.961969105 +0000 UTC m=+899.858927783" Oct 04 02:55:22 crc kubenswrapper[4964]: I1004 02:55:22.967594 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"58ea849f-c48c-473c-8608-694d254c47cf","Type":"ContainerStarted","Data":"a520d0e8af5bf9aed07472b58ea6c9fdd00d5af00a084474f9d20db6d7bc2e60"} Oct 04 02:55:24 crc kubenswrapper[4964]: I1004 02:55:24.000500 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fj9vk" event={"ID":"c45ccbc0-f08a-43ed-b80b-620fd961cb2d","Type":"ContainerStarted","Data":"10962ce881e3d6539632583dbb298dc10f5f72c56dddac4a60f1f220ec78782f"} Oct 04 02:55:25 crc kubenswrapper[4964]: I1004 02:55:25.015934 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e87e52ad-66be-448b-b575-6d0acd8a8d4e","Type":"ContainerStarted","Data":"08db805bf8b88d305b35ca233a5a28aab7ae81c637374c238029a67c96628722"} Oct 04 02:55:25 crc kubenswrapper[4964]: I1004 02:55:25.025737 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fj9vk" 
event={"ID":"c45ccbc0-f08a-43ed-b80b-620fd961cb2d","Type":"ContainerStarted","Data":"965e866b5f784dcdbe84981835e70740439f352c8e913f5853b46d1f25fb83e8"} Oct 04 02:55:25 crc kubenswrapper[4964]: I1004 02:55:25.025771 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:25 crc kubenswrapper[4964]: I1004 02:55:25.025794 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:25 crc kubenswrapper[4964]: I1004 02:55:25.085563 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-fj9vk" podStartSLOduration=10.859653757 podStartE2EDuration="18.085518497s" podCreationTimestamp="2025-10-04 02:55:07 +0000 UTC" firstStartedPulling="2025-10-04 02:55:10.573022128 +0000 UTC m=+890.469980766" lastFinishedPulling="2025-10-04 02:55:17.798886838 +0000 UTC m=+897.695845506" observedRunningTime="2025-10-04 02:55:25.080939606 +0000 UTC m=+904.977898274" watchObservedRunningTime="2025-10-04 02:55:25.085518497 +0000 UTC m=+904.982477145" Oct 04 02:55:26 crc kubenswrapper[4964]: I1004 02:55:26.042775 4964 generic.go:334] "Generic (PLEG): container finished" podID="0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6" containerID="10a38cc703f69d502df3401d8707e313520a01ddf639725c3951f596b97d2cad" exitCode=0 Oct 04 02:55:26 crc kubenswrapper[4964]: I1004 02:55:26.042908 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6","Type":"ContainerDied","Data":"10a38cc703f69d502df3401d8707e313520a01ddf639725c3951f596b97d2cad"} Oct 04 02:55:26 crc kubenswrapper[4964]: I1004 02:55:26.049074 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"71c6ccd3-a834-4a46-a25d-c92b7653c846","Type":"ContainerStarted","Data":"3f5de4666f134759934dca8c2b140f80f67c47fd592eb12329fe3c9b7fc378c5"} Oct 04 
02:55:26 crc kubenswrapper[4964]: I1004 02:55:26.053955 4964 generic.go:334] "Generic (PLEG): container finished" podID="79d85912-9774-4b24-bacc-13feb8d11ca4" containerID="c602ad2e3f3241a3b51884e7701b4a7dcb8743d74a91b133e72715b5dbb972e5" exitCode=0 Oct 04 02:55:26 crc kubenswrapper[4964]: I1004 02:55:26.054027 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"79d85912-9774-4b24-bacc-13feb8d11ca4","Type":"ContainerDied","Data":"c602ad2e3f3241a3b51884e7701b4a7dcb8743d74a91b133e72715b5dbb972e5"} Oct 04 02:55:26 crc kubenswrapper[4964]: I1004 02:55:26.059773 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2f1fa594-fc65-4b98-9fad-a3e2cb027eba","Type":"ContainerStarted","Data":"59daeb1a230a699e798eeabcf9111051e932a71cea72d71c47fbbe3b2cd1c78b"} Oct 04 02:55:26 crc kubenswrapper[4964]: I1004 02:55:26.103325 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 04 02:55:26 crc kubenswrapper[4964]: I1004 02:55:26.103445 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.109134995 podStartE2EDuration="18.10343597s" podCreationTimestamp="2025-10-04 02:55:08 +0000 UTC" firstStartedPulling="2025-10-04 02:55:10.709242965 +0000 UTC m=+890.606201603" lastFinishedPulling="2025-10-04 02:55:25.70354393 +0000 UTC m=+905.600502578" observedRunningTime="2025-10-04 02:55:26.102373752 +0000 UTC m=+905.999332420" watchObservedRunningTime="2025-10-04 02:55:26.10343597 +0000 UTC m=+906.000394608" Oct 04 02:55:26 crc kubenswrapper[4964]: I1004 02:55:26.155700 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.54261444 podStartE2EDuration="20.155673496s" podCreationTimestamp="2025-10-04 02:55:06 +0000 UTC" firstStartedPulling="2025-10-04 02:55:11.110884121 +0000 UTC m=+891.007842759" 
lastFinishedPulling="2025-10-04 02:55:25.723943177 +0000 UTC m=+905.620901815" observedRunningTime="2025-10-04 02:55:26.148957008 +0000 UTC m=+906.045915656" watchObservedRunningTime="2025-10-04 02:55:26.155673496 +0000 UTC m=+906.052632154" Oct 04 02:55:26 crc kubenswrapper[4964]: I1004 02:55:26.336770 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-wsscc" Oct 04 02:55:26 crc kubenswrapper[4964]: I1004 02:55:26.673904 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" Oct 04 02:55:26 crc kubenswrapper[4964]: I1004 02:55:26.761559 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wsscc"] Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.072452 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6","Type":"ContainerStarted","Data":"1b6282097914e40822c3d4cce7c6e36ea26ac1b4b5b34ec4317812d2227a7d65"} Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.075201 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"79d85912-9774-4b24-bacc-13feb8d11ca4","Type":"ContainerStarted","Data":"8961fff42677e85fe9a04529a43d40839e364cc62d9f04d921eeebbe35cdecbc"} Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.075393 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-wsscc" podUID="9b04f912-7d03-4866-b512-5b6fb5e4371f" containerName="dnsmasq-dns" containerID="cri-o://2df1326293fbba502bc45403667ca150d4fb423c28c0358432a81f7563708c4e" gracePeriod=10 Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.125815 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.690056356 podStartE2EDuration="28.12578542s" 
podCreationTimestamp="2025-10-04 02:54:59 +0000 UTC" firstStartedPulling="2025-10-04 02:55:10.483195333 +0000 UTC m=+890.380153971" lastFinishedPulling="2025-10-04 02:55:17.918924387 +0000 UTC m=+897.815883035" observedRunningTime="2025-10-04 02:55:27.10073012 +0000 UTC m=+906.997688798" watchObservedRunningTime="2025-10-04 02:55:27.12578542 +0000 UTC m=+907.022744138" Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.146023 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.800555686 podStartE2EDuration="28.146003412s" podCreationTimestamp="2025-10-04 02:54:59 +0000 UTC" firstStartedPulling="2025-10-04 02:55:10.482799532 +0000 UTC m=+890.379758180" lastFinishedPulling="2025-10-04 02:55:16.828247268 +0000 UTC m=+896.725205906" observedRunningTime="2025-10-04 02:55:27.132540787 +0000 UTC m=+907.029499435" watchObservedRunningTime="2025-10-04 02:55:27.146003412 +0000 UTC m=+907.042962060" Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.536125 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-wsscc" Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.555267 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b04f912-7d03-4866-b512-5b6fb5e4371f-config\") pod \"9b04f912-7d03-4866-b512-5b6fb5e4371f\" (UID: \"9b04f912-7d03-4866-b512-5b6fb5e4371f\") " Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.555394 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b04f912-7d03-4866-b512-5b6fb5e4371f-dns-svc\") pod \"9b04f912-7d03-4866-b512-5b6fb5e4371f\" (UID: \"9b04f912-7d03-4866-b512-5b6fb5e4371f\") " Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.555433 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bgpx\" (UniqueName: \"kubernetes.io/projected/9b04f912-7d03-4866-b512-5b6fb5e4371f-kube-api-access-5bgpx\") pod \"9b04f912-7d03-4866-b512-5b6fb5e4371f\" (UID: \"9b04f912-7d03-4866-b512-5b6fb5e4371f\") " Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.603903 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b04f912-7d03-4866-b512-5b6fb5e4371f-kube-api-access-5bgpx" (OuterVolumeSpecName: "kube-api-access-5bgpx") pod "9b04f912-7d03-4866-b512-5b6fb5e4371f" (UID: "9b04f912-7d03-4866-b512-5b6fb5e4371f"). InnerVolumeSpecName "kube-api-access-5bgpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.633840 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.634458 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b04f912-7d03-4866-b512-5b6fb5e4371f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9b04f912-7d03-4866-b512-5b6fb5e4371f" (UID: "9b04f912-7d03-4866-b512-5b6fb5e4371f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.654867 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b04f912-7d03-4866-b512-5b6fb5e4371f-config" (OuterVolumeSpecName: "config") pod "9b04f912-7d03-4866-b512-5b6fb5e4371f" (UID: "9b04f912-7d03-4866-b512-5b6fb5e4371f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.656757 4964 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b04f912-7d03-4866-b512-5b6fb5e4371f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.656825 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bgpx\" (UniqueName: \"kubernetes.io/projected/9b04f912-7d03-4866-b512-5b6fb5e4371f-kube-api-access-5bgpx\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.656839 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b04f912-7d03-4866-b512-5b6fb5e4371f-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.766351 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:27 crc kubenswrapper[4964]: I1004 02:55:27.812553 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.088880 4964 generic.go:334] "Generic (PLEG): container finished" podID="9b04f912-7d03-4866-b512-5b6fb5e4371f" containerID="2df1326293fbba502bc45403667ca150d4fb423c28c0358432a81f7563708c4e" exitCode=0 Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.090551 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-wsscc" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.091739 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-wsscc" event={"ID":"9b04f912-7d03-4866-b512-5b6fb5e4371f","Type":"ContainerDied","Data":"2df1326293fbba502bc45403667ca150d4fb423c28c0358432a81f7563708c4e"} Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.091862 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.091937 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-wsscc" event={"ID":"9b04f912-7d03-4866-b512-5b6fb5e4371f","Type":"ContainerDied","Data":"30fe4d055846317296ef4468e6233c2b140268a3c9b2d76e7b30a2bbaf2f932d"} Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.092006 4964 scope.go:117] "RemoveContainer" containerID="2df1326293fbba502bc45403667ca150d4fb423c28c0358432a81f7563708c4e" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.122481 4964 scope.go:117] "RemoveContainer" containerID="0f955b95d05aca988608a6ccff8b6e19fc438095b990d0dca76c753c35052c46" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.148867 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wsscc"] Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.154351 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wsscc"] Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.156456 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.180798 4964 scope.go:117] "RemoveContainer" containerID="2df1326293fbba502bc45403667ca150d4fb423c28c0358432a81f7563708c4e" Oct 04 02:55:28 crc kubenswrapper[4964]: E1004 02:55:28.181453 4964 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2df1326293fbba502bc45403667ca150d4fb423c28c0358432a81f7563708c4e\": container with ID starting with 2df1326293fbba502bc45403667ca150d4fb423c28c0358432a81f7563708c4e not found: ID does not exist" containerID="2df1326293fbba502bc45403667ca150d4fb423c28c0358432a81f7563708c4e" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.181495 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df1326293fbba502bc45403667ca150d4fb423c28c0358432a81f7563708c4e"} err="failed to get container status \"2df1326293fbba502bc45403667ca150d4fb423c28c0358432a81f7563708c4e\": rpc error: code = NotFound desc = could not find container \"2df1326293fbba502bc45403667ca150d4fb423c28c0358432a81f7563708c4e\": container with ID starting with 2df1326293fbba502bc45403667ca150d4fb423c28c0358432a81f7563708c4e not found: ID does not exist" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.181521 4964 scope.go:117] "RemoveContainer" containerID="0f955b95d05aca988608a6ccff8b6e19fc438095b990d0dca76c753c35052c46" Oct 04 02:55:28 crc kubenswrapper[4964]: E1004 02:55:28.181918 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f955b95d05aca988608a6ccff8b6e19fc438095b990d0dca76c753c35052c46\": container with ID starting with 0f955b95d05aca988608a6ccff8b6e19fc438095b990d0dca76c753c35052c46 not found: ID does not exist" containerID="0f955b95d05aca988608a6ccff8b6e19fc438095b990d0dca76c753c35052c46" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.181944 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f955b95d05aca988608a6ccff8b6e19fc438095b990d0dca76c753c35052c46"} err="failed to get container status \"0f955b95d05aca988608a6ccff8b6e19fc438095b990d0dca76c753c35052c46\": rpc error: code = NotFound desc = could not find container 
\"0f955b95d05aca988608a6ccff8b6e19fc438095b990d0dca76c753c35052c46\": container with ID starting with 0f955b95d05aca988608a6ccff8b6e19fc438095b990d0dca76c753c35052c46 not found: ID does not exist" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.417351 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-pfm6g"] Oct 04 02:55:28 crc kubenswrapper[4964]: E1004 02:55:28.417875 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b04f912-7d03-4866-b512-5b6fb5e4371f" containerName="dnsmasq-dns" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.417951 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b04f912-7d03-4866-b512-5b6fb5e4371f" containerName="dnsmasq-dns" Oct 04 02:55:28 crc kubenswrapper[4964]: E1004 02:55:28.418016 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b04f912-7d03-4866-b512-5b6fb5e4371f" containerName="init" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.418076 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b04f912-7d03-4866-b512-5b6fb5e4371f" containerName="init" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.418304 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b04f912-7d03-4866-b512-5b6fb5e4371f" containerName="dnsmasq-dns" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.419330 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:28 crc kubenswrapper[4964]: W1004 02:55:28.421174 4964 reflector.go:561] object-"openstack"/"ovsdbserver-nb": failed to list *v1.ConfigMap: configmaps "ovsdbserver-nb" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 04 02:55:28 crc kubenswrapper[4964]: E1004 02:55:28.421214 4964 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"ovsdbserver-nb\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovsdbserver-nb\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.433709 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-pfm6g"] Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.469320 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-pfm6g\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.469365 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-config\") pod \"dnsmasq-dns-7fd796d7df-pfm6g\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.469396 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hlj9z\" (UniqueName: \"kubernetes.io/projected/36c705f3-d087-4b79-9660-a680faa00e88-kube-api-access-hlj9z\") pod \"dnsmasq-dns-7fd796d7df-pfm6g\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.469453 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-pfm6g\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.480652 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-69tgq"] Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.481545 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.483491 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.493815 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-69tgq"] Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.572648 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/6f812e77-519b-4703-8215-d16a2cb188dd-ovs-rundir\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.572691 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f812e77-519b-4703-8215-d16a2cb188dd-config\") 
pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.572755 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-config\") pod \"dnsmasq-dns-7fd796d7df-pfm6g\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.572781 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlj9z\" (UniqueName: \"kubernetes.io/projected/36c705f3-d087-4b79-9660-a680faa00e88-kube-api-access-hlj9z\") pod \"dnsmasq-dns-7fd796d7df-pfm6g\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.572865 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/6f812e77-519b-4703-8215-d16a2cb188dd-ovn-rundir\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.572958 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-pfm6g\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.573183 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8trln\" (UniqueName: \"kubernetes.io/projected/6f812e77-519b-4703-8215-d16a2cb188dd-kube-api-access-8trln\") pod 
\"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.573213 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f812e77-519b-4703-8215-d16a2cb188dd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.573260 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f812e77-519b-4703-8215-d16a2cb188dd-combined-ca-bundle\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.576971 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-pfm6g\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.577963 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-config\") pod \"dnsmasq-dns-7fd796d7df-pfm6g\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.578996 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-pfm6g\" (UID: 
\"36c705f3-d087-4b79-9660-a680faa00e88\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.600901 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-pfm6g"] Oct 04 02:55:28 crc kubenswrapper[4964]: E1004 02:55:28.601456 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-hlj9z ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" podUID="36c705f3-d087-4b79-9660-a680faa00e88" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.614873 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlj9z\" (UniqueName: \"kubernetes.io/projected/36c705f3-d087-4b79-9660-a680faa00e88-kube-api-access-hlj9z\") pod \"dnsmasq-dns-7fd796d7df-pfm6g\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.632554 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zgcfk"] Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.635171 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.635548 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.636842 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.645050 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zgcfk"] Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.675182 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.678331 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.678451 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/6f812e77-519b-4703-8215-d16a2cb188dd-ovn-rundir\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.678566 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnqjk\" (UniqueName: \"kubernetes.io/projected/1e2229e9-e432-454f-8133-9d94f1a785a9-kube-api-access-dnqjk\") pod \"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.678698 
4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.678751 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.678791 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/6f812e77-519b-4703-8215-d16a2cb188dd-ovn-rundir\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.678796 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-config\") pod \"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.678849 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8trln\" (UniqueName: \"kubernetes.io/projected/6f812e77-519b-4703-8215-d16a2cb188dd-kube-api-access-8trln\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.678875 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f812e77-519b-4703-8215-d16a2cb188dd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.678901 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f812e77-519b-4703-8215-d16a2cb188dd-combined-ca-bundle\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.678936 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/6f812e77-519b-4703-8215-d16a2cb188dd-ovs-rundir\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.678954 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f812e77-519b-4703-8215-d16a2cb188dd-config\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.679212 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/6f812e77-519b-4703-8215-d16a2cb188dd-ovs-rundir\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.679602 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6f812e77-519b-4703-8215-d16a2cb188dd-config\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.682291 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f812e77-519b-4703-8215-d16a2cb188dd-combined-ca-bundle\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.682597 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f812e77-519b-4703-8215-d16a2cb188dd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.701944 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8trln\" (UniqueName: \"kubernetes.io/projected/6f812e77-519b-4703-8215-d16a2cb188dd-kube-api-access-8trln\") pod \"ovn-controller-metrics-69tgq\" (UID: \"6f812e77-519b-4703-8215-d16a2cb188dd\") " pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.780162 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnqjk\" (UniqueName: \"kubernetes.io/projected/1e2229e9-e432-454f-8133-9d94f1a785a9-kube-api-access-dnqjk\") pod \"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.780347 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.780387 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.780441 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-config\") pod \"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.780678 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.781806 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.782949 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-ovsdbserver-sb\") pod 
\"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.783809 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-config\") pod \"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.804472 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-69tgq" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.810085 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnqjk\" (UniqueName: \"kubernetes.io/projected/1e2229e9-e432-454f-8133-9d94f1a785a9-kube-api-access-dnqjk\") pod \"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:28 crc kubenswrapper[4964]: I1004 02:55:28.855755 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b04f912-7d03-4866-b512-5b6fb5e4371f" path="/var/lib/kubelet/pods/9b04f912-7d03-4866-b512-5b6fb5e4371f/volumes" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.096695 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.115412 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.140348 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.241646 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-69tgq"] Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.289279 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-dns-svc\") pod \"36c705f3-d087-4b79-9660-a680faa00e88\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.289985 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "36c705f3-d087-4b79-9660-a680faa00e88" (UID: "36c705f3-d087-4b79-9660-a680faa00e88"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.290218 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlj9z\" (UniqueName: \"kubernetes.io/projected/36c705f3-d087-4b79-9660-a680faa00e88-kube-api-access-hlj9z\") pod \"36c705f3-d087-4b79-9660-a680faa00e88\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.290454 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-config\") pod \"36c705f3-d087-4b79-9660-a680faa00e88\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.292013 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-config" (OuterVolumeSpecName: "config") pod "36c705f3-d087-4b79-9660-a680faa00e88" (UID: "36c705f3-d087-4b79-9660-a680faa00e88"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.292709 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.294171 4964 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.296056 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c705f3-d087-4b79-9660-a680faa00e88-kube-api-access-hlj9z" (OuterVolumeSpecName: "kube-api-access-hlj9z") pod "36c705f3-d087-4b79-9660-a680faa00e88" (UID: "36c705f3-d087-4b79-9660-a680faa00e88"). InnerVolumeSpecName "kube-api-access-hlj9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.353477 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.354819 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.356532 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.358916 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.359329 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.359858 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-hgv9w" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.385381 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.396108 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlj9z\" (UniqueName: \"kubernetes.io/projected/36c705f3-d087-4b79-9660-a680faa00e88-kube-api-access-hlj9z\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.498029 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.498108 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-scripts\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.498294 4964 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-config\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.498345 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.498588 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.498681 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.498718 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h869\" (UniqueName: \"kubernetes.io/projected/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-kube-api-access-5h869\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: E1004 02:55:29.578198 4964 configmap.go:193] Couldn't get configMap openstack/ovsdbserver-nb: failed to sync configmap cache: timed out 
waiting for the condition Oct 04 02:55:29 crc kubenswrapper[4964]: E1004 02:55:29.578292 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-ovsdbserver-nb podName:36c705f3-d087-4b79-9660-a680faa00e88 nodeName:}" failed. No retries permitted until 2025-10-04 02:55:30.07827221 +0000 UTC m=+909.975230848 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb" (UniqueName: "kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-ovsdbserver-nb") pod "dnsmasq-dns-7fd796d7df-pfm6g" (UID: "36c705f3-d087-4b79-9660-a680faa00e88") : failed to sync configmap cache: timed out waiting for the condition Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.600323 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.600428 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-scripts\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.600499 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-config\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.600532 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.600657 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.601469 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.601521 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h869\" (UniqueName: \"kubernetes.io/projected/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-kube-api-access-5h869\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.601598 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-config\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.601649 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-scripts\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.602085 
4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.605889 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.606274 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.606280 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.621290 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h869\" (UniqueName: \"kubernetes.io/projected/e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4-kube-api-access-5h869\") pod \"ovn-northd-0\" (UID: \"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4\") " pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.688118 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 04 02:55:29 crc kubenswrapper[4964]: E1004 02:55:29.784891 4964 configmap.go:193] Couldn't get configMap openstack/ovsdbserver-nb: failed to sync configmap cache: timed out waiting for the condition Oct 04 02:55:29 crc kubenswrapper[4964]: E1004 02:55:29.785163 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-ovsdbserver-nb podName:1e2229e9-e432-454f-8133-9d94f1a785a9 nodeName:}" failed. No retries permitted until 2025-10-04 02:55:30.285146053 +0000 UTC m=+910.182104691 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb" (UniqueName: "kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-ovsdbserver-nb") pod "dnsmasq-dns-86db49b7ff-zgcfk" (UID: "1e2229e9-e432-454f-8133-9d94f1a785a9") : failed to sync configmap cache: timed out waiting for the condition Oct 04 02:55:29 crc kubenswrapper[4964]: I1004 02:55:29.802195 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.107814 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-69tgq" event={"ID":"6f812e77-519b-4703-8215-d16a2cb188dd","Type":"ContainerStarted","Data":"9c917601ebae99888c73486a9e2863c28ec36971e8a7006c32f018d110a4c2cf"} Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.107871 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-69tgq" event={"ID":"6f812e77-519b-4703-8215-d16a2cb188dd","Type":"ContainerStarted","Data":"b3ed944ee5cf478c1b235bb23ee6bc156e2802f1c6aa74f0d721c7fef4036470"} Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.107830 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.109462 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-pfm6g\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.110330 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-pfm6g\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfm6g" Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.135338 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.136517 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-69tgq" podStartSLOduration=2.136496882 podStartE2EDuration="2.136496882s" podCreationTimestamp="2025-10-04 02:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:55:30.129732289 +0000 UTC m=+910.026690937" watchObservedRunningTime="2025-10-04 02:55:30.136496882 +0000 UTC m=+910.033455530" Oct 04 02:55:30 crc kubenswrapper[4964]: W1004 02:55:30.166767 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1e3c67f_6de6_4a44_a3a3_9ca24a141ac4.slice/crio-6c067cee6d7ba65fb92f2a182fbd1fe13467edf6265a2d467cab21ab25728349 WatchSource:0}: Error finding container 6c067cee6d7ba65fb92f2a182fbd1fe13467edf6265a2d467cab21ab25728349: Status 404 returned error 
can't find the container with id 6c067cee6d7ba65fb92f2a182fbd1fe13467edf6265a2d467cab21ab25728349 Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.191216 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-pfm6g"] Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.201181 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-pfm6g"] Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.216389 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-ovsdbserver-nb\") pod \"36c705f3-d087-4b79-9660-a680faa00e88\" (UID: \"36c705f3-d087-4b79-9660-a680faa00e88\") " Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.216916 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "36c705f3-d087-4b79-9660-a680faa00e88" (UID: "36c705f3-d087-4b79-9660-a680faa00e88"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.217231 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36c705f3-d087-4b79-9660-a680faa00e88-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.319423 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.320417 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zgcfk\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.362175 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.362241 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.467811 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.468378 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.474827 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.856230 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c705f3-d087-4b79-9660-a680faa00e88" path="/var/lib/kubelet/pods/36c705f3-d087-4b79-9660-a680faa00e88/volumes" Oct 04 02:55:30 crc kubenswrapper[4964]: I1004 02:55:30.969595 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zgcfk"] Oct 04 02:55:31 crc kubenswrapper[4964]: I1004 02:55:31.117579 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4","Type":"ContainerStarted","Data":"6c067cee6d7ba65fb92f2a182fbd1fe13467edf6265a2d467cab21ab25728349"} Oct 04 02:55:31 crc kubenswrapper[4964]: E1004 02:55:31.123692 4964 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.75:40710->38.102.83.75:32799: write tcp 38.102.83.75:40710->38.102.83.75:32799: write: broken pipe Oct 04 02:55:31 crc kubenswrapper[4964]: E1004 02:55:31.123687 4964 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.75:40710->38.102.83.75:32799: read tcp 38.102.83.75:40710->38.102.83.75:32799: read: connection reset by peer Oct 04 02:55:32 crc kubenswrapper[4964]: I1004 02:55:32.129130 4964 generic.go:334] "Generic (PLEG): container finished" podID="1e2229e9-e432-454f-8133-9d94f1a785a9" containerID="a055067fe903878e90cc3a9242df9a82579ab70bb8ecc46d63a6c8be4b9d3d4c" exitCode=0 Oct 04 02:55:32 crc kubenswrapper[4964]: I1004 02:55:32.129272 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" event={"ID":"1e2229e9-e432-454f-8133-9d94f1a785a9","Type":"ContainerDied","Data":"a055067fe903878e90cc3a9242df9a82579ab70bb8ecc46d63a6c8be4b9d3d4c"} Oct 04 02:55:32 crc kubenswrapper[4964]: I1004 02:55:32.130538 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" event={"ID":"1e2229e9-e432-454f-8133-9d94f1a785a9","Type":"ContainerStarted","Data":"e70d872d6c86a3cb39a33157572d260c7a351f7ed2e7798394d74efa31ef7e67"} Oct 04 02:55:32 crc kubenswrapper[4964]: I1004 02:55:32.134257 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4","Type":"ContainerStarted","Data":"6a70e56aaafc5042270854aa839d555599e1f41a49ea1dac4be53f2d43f6dcfa"} Oct 04 02:55:32 crc kubenswrapper[4964]: I1004 02:55:32.134338 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4","Type":"ContainerStarted","Data":"f553cec1833030ad510e34b54c0da096363c0e9a2b6f7a55ee44bcd926aef46f"} Oct 04 02:55:32 crc kubenswrapper[4964]: I1004 02:55:32.134663 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 04 02:55:32 crc kubenswrapper[4964]: I1004 02:55:32.206418 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.168678696 podStartE2EDuration="3.206390301s" podCreationTimestamp="2025-10-04 02:55:29 +0000 UTC" firstStartedPulling="2025-10-04 02:55:30.168671816 +0000 UTC m=+910.065630464" lastFinishedPulling="2025-10-04 02:55:31.206383421 +0000 UTC m=+911.103342069" observedRunningTime="2025-10-04 02:55:32.195217531 +0000 UTC m=+912.092176209" watchObservedRunningTime="2025-10-04 02:55:32.206390301 +0000 UTC m=+912.103348969" Oct 04 02:55:32 crc kubenswrapper[4964]: I1004 02:55:32.795121 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 04 02:55:33 crc kubenswrapper[4964]: I1004 02:55:33.078193 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:33 crc kubenswrapper[4964]: I1004 02:55:33.123605 4964 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 04 02:55:33 crc kubenswrapper[4964]: I1004 02:55:33.142987 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" event={"ID":"1e2229e9-e432-454f-8133-9d94f1a785a9","Type":"ContainerStarted","Data":"d98a05a86de0c64a89db8bac4ed4c51a3b592c63391ed898921e437950d7f0e3"} Oct 04 02:55:33 crc kubenswrapper[4964]: I1004 02:55:33.176293 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" podStartSLOduration=5.176278552 podStartE2EDuration="5.176278552s" podCreationTimestamp="2025-10-04 02:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:55:33.171743049 +0000 UTC m=+913.068701687" watchObservedRunningTime="2025-10-04 02:55:33.176278552 +0000 UTC m=+913.073237190" Oct 04 02:55:34 crc kubenswrapper[4964]: I1004 02:55:34.153132 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:34 crc kubenswrapper[4964]: I1004 02:55:34.449222 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:55:34 crc kubenswrapper[4964]: I1004 02:55:34.449361 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:55:36 crc kubenswrapper[4964]: I1004 02:55:36.460791 4964 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 04 02:55:36 crc kubenswrapper[4964]: I1004 02:55:36.510037 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 04 02:55:40 crc kubenswrapper[4964]: I1004 02:55:40.531870 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:55:40 crc kubenswrapper[4964]: I1004 02:55:40.604075 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wrd88"] Oct 04 02:55:40 crc kubenswrapper[4964]: I1004 02:55:40.604325 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" podUID="f20df9f7-d92f-4c37-a96f-2afcdc14307c" containerName="dnsmasq-dns" containerID="cri-o://bb0e4197a319ada83addebfb99bc534d45d1452c5a72f868412b4f4f4a2208f6" gracePeriod=10 Oct 04 02:55:40 crc kubenswrapper[4964]: I1004 02:55:40.809572 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-srz4q"] Oct 04 02:55:40 crc kubenswrapper[4964]: I1004 02:55:40.811223 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-srz4q" Oct 04 02:55:40 crc kubenswrapper[4964]: I1004 02:55:40.818640 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-srz4q"] Oct 04 02:55:40 crc kubenswrapper[4964]: I1004 02:55:40.939866 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmd5g\" (UniqueName: \"kubernetes.io/projected/3268441f-8f21-41a2-a231-4791cd94f615-kube-api-access-rmd5g\") pod \"keystone-db-create-srz4q\" (UID: \"3268441f-8f21-41a2-a231-4791cd94f615\") " pod="openstack/keystone-db-create-srz4q" Oct 04 02:55:40 crc kubenswrapper[4964]: I1004 02:55:40.985489 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-vgrv6"] Oct 04 02:55:40 crc kubenswrapper[4964]: I1004 02:55:40.986441 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vgrv6" Oct 04 02:55:40 crc kubenswrapper[4964]: I1004 02:55:40.995357 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vgrv6"] Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.043446 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmd5g\" (UniqueName: \"kubernetes.io/projected/3268441f-8f21-41a2-a231-4791cd94f615-kube-api-access-rmd5g\") pod \"keystone-db-create-srz4q\" (UID: \"3268441f-8f21-41a2-a231-4791cd94f615\") " pod="openstack/keystone-db-create-srz4q" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.043573 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4mfc\" (UniqueName: \"kubernetes.io/projected/3219f838-82a7-4145-aa9c-e7dd4557d10b-kube-api-access-s4mfc\") pod \"placement-db-create-vgrv6\" (UID: \"3219f838-82a7-4145-aa9c-e7dd4557d10b\") " pod="openstack/placement-db-create-vgrv6" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 
02:55:41.060989 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.069815 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmd5g\" (UniqueName: \"kubernetes.io/projected/3268441f-8f21-41a2-a231-4791cd94f615-kube-api-access-rmd5g\") pod \"keystone-db-create-srz4q\" (UID: \"3268441f-8f21-41a2-a231-4791cd94f615\") " pod="openstack/keystone-db-create-srz4q" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.131699 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-srz4q" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.144327 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f20df9f7-d92f-4c37-a96f-2afcdc14307c-dns-svc\") pod \"f20df9f7-d92f-4c37-a96f-2afcdc14307c\" (UID: \"f20df9f7-d92f-4c37-a96f-2afcdc14307c\") " Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.144386 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl58p\" (UniqueName: \"kubernetes.io/projected/f20df9f7-d92f-4c37-a96f-2afcdc14307c-kube-api-access-pl58p\") pod \"f20df9f7-d92f-4c37-a96f-2afcdc14307c\" (UID: \"f20df9f7-d92f-4c37-a96f-2afcdc14307c\") " Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.144456 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f20df9f7-d92f-4c37-a96f-2afcdc14307c-config\") pod \"f20df9f7-d92f-4c37-a96f-2afcdc14307c\" (UID: \"f20df9f7-d92f-4c37-a96f-2afcdc14307c\") " Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.144691 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4mfc\" (UniqueName: 
\"kubernetes.io/projected/3219f838-82a7-4145-aa9c-e7dd4557d10b-kube-api-access-s4mfc\") pod \"placement-db-create-vgrv6\" (UID: \"3219f838-82a7-4145-aa9c-e7dd4557d10b\") " pod="openstack/placement-db-create-vgrv6" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.149218 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f20df9f7-d92f-4c37-a96f-2afcdc14307c-kube-api-access-pl58p" (OuterVolumeSpecName: "kube-api-access-pl58p") pod "f20df9f7-d92f-4c37-a96f-2afcdc14307c" (UID: "f20df9f7-d92f-4c37-a96f-2afcdc14307c"). InnerVolumeSpecName "kube-api-access-pl58p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.161908 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4mfc\" (UniqueName: \"kubernetes.io/projected/3219f838-82a7-4145-aa9c-e7dd4557d10b-kube-api-access-s4mfc\") pod \"placement-db-create-vgrv6\" (UID: \"3219f838-82a7-4145-aa9c-e7dd4557d10b\") " pod="openstack/placement-db-create-vgrv6" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.195336 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f20df9f7-d92f-4c37-a96f-2afcdc14307c-config" (OuterVolumeSpecName: "config") pod "f20df9f7-d92f-4c37-a96f-2afcdc14307c" (UID: "f20df9f7-d92f-4c37-a96f-2afcdc14307c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.204971 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f20df9f7-d92f-4c37-a96f-2afcdc14307c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f20df9f7-d92f-4c37-a96f-2afcdc14307c" (UID: "f20df9f7-d92f-4c37-a96f-2afcdc14307c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.215647 4964 generic.go:334] "Generic (PLEG): container finished" podID="f20df9f7-d92f-4c37-a96f-2afcdc14307c" containerID="bb0e4197a319ada83addebfb99bc534d45d1452c5a72f868412b4f4f4a2208f6" exitCode=0 Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.215702 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" event={"ID":"f20df9f7-d92f-4c37-a96f-2afcdc14307c","Type":"ContainerDied","Data":"bb0e4197a319ada83addebfb99bc534d45d1452c5a72f868412b4f4f4a2208f6"} Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.215736 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" event={"ID":"f20df9f7-d92f-4c37-a96f-2afcdc14307c","Type":"ContainerDied","Data":"2046cdd8bb7f7c10338a6c97dda7a11e164549e8ed48087c1182018e0083c6f2"} Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.215756 4964 scope.go:117] "RemoveContainer" containerID="bb0e4197a319ada83addebfb99bc534d45d1452c5a72f868412b4f4f4a2208f6" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.215933 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wrd88" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.246745 4964 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f20df9f7-d92f-4c37-a96f-2afcdc14307c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.246786 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl58p\" (UniqueName: \"kubernetes.io/projected/f20df9f7-d92f-4c37-a96f-2afcdc14307c-kube-api-access-pl58p\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.246801 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f20df9f7-d92f-4c37-a96f-2afcdc14307c-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.265108 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wrd88"] Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.271967 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wrd88"] Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.282420 4964 scope.go:117] "RemoveContainer" containerID="0c98b598f99716598a23610f9c7687bc3d970191911fc235c6eb420653fbb66d" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.305319 4964 scope.go:117] "RemoveContainer" containerID="bb0e4197a319ada83addebfb99bc534d45d1452c5a72f868412b4f4f4a2208f6" Oct 04 02:55:41 crc kubenswrapper[4964]: E1004 02:55:41.305751 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb0e4197a319ada83addebfb99bc534d45d1452c5a72f868412b4f4f4a2208f6\": container with ID starting with bb0e4197a319ada83addebfb99bc534d45d1452c5a72f868412b4f4f4a2208f6 not found: ID does not exist" 
containerID="bb0e4197a319ada83addebfb99bc534d45d1452c5a72f868412b4f4f4a2208f6" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.305795 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb0e4197a319ada83addebfb99bc534d45d1452c5a72f868412b4f4f4a2208f6"} err="failed to get container status \"bb0e4197a319ada83addebfb99bc534d45d1452c5a72f868412b4f4f4a2208f6\": rpc error: code = NotFound desc = could not find container \"bb0e4197a319ada83addebfb99bc534d45d1452c5a72f868412b4f4f4a2208f6\": container with ID starting with bb0e4197a319ada83addebfb99bc534d45d1452c5a72f868412b4f4f4a2208f6 not found: ID does not exist" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.305823 4964 scope.go:117] "RemoveContainer" containerID="0c98b598f99716598a23610f9c7687bc3d970191911fc235c6eb420653fbb66d" Oct 04 02:55:41 crc kubenswrapper[4964]: E1004 02:55:41.306190 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c98b598f99716598a23610f9c7687bc3d970191911fc235c6eb420653fbb66d\": container with ID starting with 0c98b598f99716598a23610f9c7687bc3d970191911fc235c6eb420653fbb66d not found: ID does not exist" containerID="0c98b598f99716598a23610f9c7687bc3d970191911fc235c6eb420653fbb66d" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.306214 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c98b598f99716598a23610f9c7687bc3d970191911fc235c6eb420653fbb66d"} err="failed to get container status \"0c98b598f99716598a23610f9c7687bc3d970191911fc235c6eb420653fbb66d\": rpc error: code = NotFound desc = could not find container \"0c98b598f99716598a23610f9c7687bc3d970191911fc235c6eb420653fbb66d\": container with ID starting with 0c98b598f99716598a23610f9c7687bc3d970191911fc235c6eb420653fbb66d not found: ID does not exist" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.355746 4964 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vgrv6" Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.575358 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vgrv6"] Oct 04 02:55:41 crc kubenswrapper[4964]: I1004 02:55:41.599976 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-srz4q"] Oct 04 02:55:41 crc kubenswrapper[4964]: W1004 02:55:41.604987 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3268441f_8f21_41a2_a231_4791cd94f615.slice/crio-254b7c4f51544ca8e12184865be24ef979123818b1b03ea3497f955b5700adfb WatchSource:0}: Error finding container 254b7c4f51544ca8e12184865be24ef979123818b1b03ea3497f955b5700adfb: Status 404 returned error can't find the container with id 254b7c4f51544ca8e12184865be24ef979123818b1b03ea3497f955b5700adfb Oct 04 02:55:42 crc kubenswrapper[4964]: I1004 02:55:42.226020 4964 generic.go:334] "Generic (PLEG): container finished" podID="3219f838-82a7-4145-aa9c-e7dd4557d10b" containerID="fbda6431e83369b5ab7b7104d3d38ef14505db0edfe5f0c58e080a35e0da541f" exitCode=0 Oct 04 02:55:42 crc kubenswrapper[4964]: I1004 02:55:42.226334 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vgrv6" event={"ID":"3219f838-82a7-4145-aa9c-e7dd4557d10b","Type":"ContainerDied","Data":"fbda6431e83369b5ab7b7104d3d38ef14505db0edfe5f0c58e080a35e0da541f"} Oct 04 02:55:42 crc kubenswrapper[4964]: I1004 02:55:42.226362 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vgrv6" event={"ID":"3219f838-82a7-4145-aa9c-e7dd4557d10b","Type":"ContainerStarted","Data":"32fb4d641eebe4577be61b410334071ddd0a76232088cbb2b1e484dfb1baf695"} Oct 04 02:55:42 crc kubenswrapper[4964]: I1004 02:55:42.228536 4964 generic.go:334] "Generic (PLEG): container finished" 
podID="3268441f-8f21-41a2-a231-4791cd94f615" containerID="73183fd374e58f891850fec52755929e6de0fb9eeb1c5f46d9b82630c248f241" exitCode=0 Oct 04 02:55:42 crc kubenswrapper[4964]: I1004 02:55:42.228608 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-srz4q" event={"ID":"3268441f-8f21-41a2-a231-4791cd94f615","Type":"ContainerDied","Data":"73183fd374e58f891850fec52755929e6de0fb9eeb1c5f46d9b82630c248f241"} Oct 04 02:55:42 crc kubenswrapper[4964]: I1004 02:55:42.228652 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-srz4q" event={"ID":"3268441f-8f21-41a2-a231-4791cd94f615","Type":"ContainerStarted","Data":"254b7c4f51544ca8e12184865be24ef979123818b1b03ea3497f955b5700adfb"} Oct 04 02:55:42 crc kubenswrapper[4964]: I1004 02:55:42.863261 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f20df9f7-d92f-4c37-a96f-2afcdc14307c" path="/var/lib/kubelet/pods/f20df9f7-d92f-4c37-a96f-2afcdc14307c/volumes" Oct 04 02:55:43 crc kubenswrapper[4964]: I1004 02:55:43.720739 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-srz4q" Oct 04 02:55:43 crc kubenswrapper[4964]: I1004 02:55:43.724234 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-vgrv6" Oct 04 02:55:43 crc kubenswrapper[4964]: I1004 02:55:43.895792 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmd5g\" (UniqueName: \"kubernetes.io/projected/3268441f-8f21-41a2-a231-4791cd94f615-kube-api-access-rmd5g\") pod \"3268441f-8f21-41a2-a231-4791cd94f615\" (UID: \"3268441f-8f21-41a2-a231-4791cd94f615\") " Oct 04 02:55:43 crc kubenswrapper[4964]: I1004 02:55:43.896955 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4mfc\" (UniqueName: \"kubernetes.io/projected/3219f838-82a7-4145-aa9c-e7dd4557d10b-kube-api-access-s4mfc\") pod \"3219f838-82a7-4145-aa9c-e7dd4557d10b\" (UID: \"3219f838-82a7-4145-aa9c-e7dd4557d10b\") " Oct 04 02:55:43 crc kubenswrapper[4964]: I1004 02:55:43.901413 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3268441f-8f21-41a2-a231-4791cd94f615-kube-api-access-rmd5g" (OuterVolumeSpecName: "kube-api-access-rmd5g") pod "3268441f-8f21-41a2-a231-4791cd94f615" (UID: "3268441f-8f21-41a2-a231-4791cd94f615"). InnerVolumeSpecName "kube-api-access-rmd5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:55:43 crc kubenswrapper[4964]: I1004 02:55:43.902261 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3219f838-82a7-4145-aa9c-e7dd4557d10b-kube-api-access-s4mfc" (OuterVolumeSpecName: "kube-api-access-s4mfc") pod "3219f838-82a7-4145-aa9c-e7dd4557d10b" (UID: "3219f838-82a7-4145-aa9c-e7dd4557d10b"). InnerVolumeSpecName "kube-api-access-s4mfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:55:43 crc kubenswrapper[4964]: I1004 02:55:43.998427 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4mfc\" (UniqueName: \"kubernetes.io/projected/3219f838-82a7-4145-aa9c-e7dd4557d10b-kube-api-access-s4mfc\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:43 crc kubenswrapper[4964]: I1004 02:55:43.998456 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmd5g\" (UniqueName: \"kubernetes.io/projected/3268441f-8f21-41a2-a231-4791cd94f615-kube-api-access-rmd5g\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:44 crc kubenswrapper[4964]: I1004 02:55:44.249842 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vgrv6" event={"ID":"3219f838-82a7-4145-aa9c-e7dd4557d10b","Type":"ContainerDied","Data":"32fb4d641eebe4577be61b410334071ddd0a76232088cbb2b1e484dfb1baf695"} Oct 04 02:55:44 crc kubenswrapper[4964]: I1004 02:55:44.249898 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32fb4d641eebe4577be61b410334071ddd0a76232088cbb2b1e484dfb1baf695" Oct 04 02:55:44 crc kubenswrapper[4964]: I1004 02:55:44.249858 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vgrv6" Oct 04 02:55:44 crc kubenswrapper[4964]: I1004 02:55:44.252237 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-srz4q" event={"ID":"3268441f-8f21-41a2-a231-4791cd94f615","Type":"ContainerDied","Data":"254b7c4f51544ca8e12184865be24ef979123818b1b03ea3497f955b5700adfb"} Oct 04 02:55:44 crc kubenswrapper[4964]: I1004 02:55:44.252296 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-srz4q" Oct 04 02:55:44 crc kubenswrapper[4964]: I1004 02:55:44.252318 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="254b7c4f51544ca8e12184865be24ef979123818b1b03ea3497f955b5700adfb" Oct 04 02:55:44 crc kubenswrapper[4964]: I1004 02:55:44.792244 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 04 02:55:46 crc kubenswrapper[4964]: I1004 02:55:46.299092 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-5bxql"] Oct 04 02:55:46 crc kubenswrapper[4964]: E1004 02:55:46.299857 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20df9f7-d92f-4c37-a96f-2afcdc14307c" containerName="dnsmasq-dns" Oct 04 02:55:46 crc kubenswrapper[4964]: I1004 02:55:46.299879 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20df9f7-d92f-4c37-a96f-2afcdc14307c" containerName="dnsmasq-dns" Oct 04 02:55:46 crc kubenswrapper[4964]: E1004 02:55:46.299906 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3268441f-8f21-41a2-a231-4791cd94f615" containerName="mariadb-database-create" Oct 04 02:55:46 crc kubenswrapper[4964]: I1004 02:55:46.299918 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="3268441f-8f21-41a2-a231-4791cd94f615" containerName="mariadb-database-create" Oct 04 02:55:46 crc kubenswrapper[4964]: E1004 02:55:46.299948 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20df9f7-d92f-4c37-a96f-2afcdc14307c" containerName="init" Oct 04 02:55:46 crc kubenswrapper[4964]: I1004 02:55:46.299960 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20df9f7-d92f-4c37-a96f-2afcdc14307c" containerName="init" Oct 04 02:55:46 crc kubenswrapper[4964]: E1004 02:55:46.299980 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3219f838-82a7-4145-aa9c-e7dd4557d10b" containerName="mariadb-database-create" Oct 
04 02:55:46 crc kubenswrapper[4964]: I1004 02:55:46.299993 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="3219f838-82a7-4145-aa9c-e7dd4557d10b" containerName="mariadb-database-create" Oct 04 02:55:46 crc kubenswrapper[4964]: I1004 02:55:46.300296 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="f20df9f7-d92f-4c37-a96f-2afcdc14307c" containerName="dnsmasq-dns" Oct 04 02:55:46 crc kubenswrapper[4964]: I1004 02:55:46.300348 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="3268441f-8f21-41a2-a231-4791cd94f615" containerName="mariadb-database-create" Oct 04 02:55:46 crc kubenswrapper[4964]: I1004 02:55:46.300370 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="3219f838-82a7-4145-aa9c-e7dd4557d10b" containerName="mariadb-database-create" Oct 04 02:55:46 crc kubenswrapper[4964]: I1004 02:55:46.301127 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5bxql" Oct 04 02:55:46 crc kubenswrapper[4964]: I1004 02:55:46.312654 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5bxql"] Oct 04 02:55:46 crc kubenswrapper[4964]: I1004 02:55:46.355698 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfcv4\" (UniqueName: \"kubernetes.io/projected/2435773b-23eb-4a67-b454-4531b0c41831-kube-api-access-wfcv4\") pod \"glance-db-create-5bxql\" (UID: \"2435773b-23eb-4a67-b454-4531b0c41831\") " pod="openstack/glance-db-create-5bxql" Oct 04 02:55:46 crc kubenswrapper[4964]: I1004 02:55:46.457537 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfcv4\" (UniqueName: \"kubernetes.io/projected/2435773b-23eb-4a67-b454-4531b0c41831-kube-api-access-wfcv4\") pod \"glance-db-create-5bxql\" (UID: \"2435773b-23eb-4a67-b454-4531b0c41831\") " pod="openstack/glance-db-create-5bxql" Oct 04 02:55:46 crc kubenswrapper[4964]: 
I1004 02:55:46.481823 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfcv4\" (UniqueName: \"kubernetes.io/projected/2435773b-23eb-4a67-b454-4531b0c41831-kube-api-access-wfcv4\") pod \"glance-db-create-5bxql\" (UID: \"2435773b-23eb-4a67-b454-4531b0c41831\") " pod="openstack/glance-db-create-5bxql" Oct 04 02:55:46 crc kubenswrapper[4964]: I1004 02:55:46.626844 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5bxql" Oct 04 02:55:47 crc kubenswrapper[4964]: I1004 02:55:47.195723 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5bxql"] Oct 04 02:55:47 crc kubenswrapper[4964]: W1004 02:55:47.199435 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2435773b_23eb_4a67_b454_4531b0c41831.slice/crio-d5ffdcf2272a75343a0103f4703695ad9dccc71476b0b76fc2a44b430657ead0 WatchSource:0}: Error finding container d5ffdcf2272a75343a0103f4703695ad9dccc71476b0b76fc2a44b430657ead0: Status 404 returned error can't find the container with id d5ffdcf2272a75343a0103f4703695ad9dccc71476b0b76fc2a44b430657ead0 Oct 04 02:55:47 crc kubenswrapper[4964]: I1004 02:55:47.283530 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5bxql" event={"ID":"2435773b-23eb-4a67-b454-4531b0c41831","Type":"ContainerStarted","Data":"d5ffdcf2272a75343a0103f4703695ad9dccc71476b0b76fc2a44b430657ead0"} Oct 04 02:55:48 crc kubenswrapper[4964]: I1004 02:55:48.292197 4964 generic.go:334] "Generic (PLEG): container finished" podID="2435773b-23eb-4a67-b454-4531b0c41831" containerID="24b93231cc2d64dccaf39d13f08f312ae902afdec4c74a08d7d2d545c575c86c" exitCode=0 Oct 04 02:55:48 crc kubenswrapper[4964]: I1004 02:55:48.292249 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5bxql" 
event={"ID":"2435773b-23eb-4a67-b454-4531b0c41831","Type":"ContainerDied","Data":"24b93231cc2d64dccaf39d13f08f312ae902afdec4c74a08d7d2d545c575c86c"} Oct 04 02:55:49 crc kubenswrapper[4964]: I1004 02:55:49.720741 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5bxql" Oct 04 02:55:49 crc kubenswrapper[4964]: I1004 02:55:49.825034 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfcv4\" (UniqueName: \"kubernetes.io/projected/2435773b-23eb-4a67-b454-4531b0c41831-kube-api-access-wfcv4\") pod \"2435773b-23eb-4a67-b454-4531b0c41831\" (UID: \"2435773b-23eb-4a67-b454-4531b0c41831\") " Oct 04 02:55:49 crc kubenswrapper[4964]: I1004 02:55:49.832604 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2435773b-23eb-4a67-b454-4531b0c41831-kube-api-access-wfcv4" (OuterVolumeSpecName: "kube-api-access-wfcv4") pod "2435773b-23eb-4a67-b454-4531b0c41831" (UID: "2435773b-23eb-4a67-b454-4531b0c41831"). InnerVolumeSpecName "kube-api-access-wfcv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:55:49 crc kubenswrapper[4964]: I1004 02:55:49.927586 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfcv4\" (UniqueName: \"kubernetes.io/projected/2435773b-23eb-4a67-b454-4531b0c41831-kube-api-access-wfcv4\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:50 crc kubenswrapper[4964]: I1004 02:55:50.316799 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5bxql" event={"ID":"2435773b-23eb-4a67-b454-4531b0c41831","Type":"ContainerDied","Data":"d5ffdcf2272a75343a0103f4703695ad9dccc71476b0b76fc2a44b430657ead0"} Oct 04 02:55:50 crc kubenswrapper[4964]: I1004 02:55:50.316844 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5ffdcf2272a75343a0103f4703695ad9dccc71476b0b76fc2a44b430657ead0" Oct 04 02:55:50 crc kubenswrapper[4964]: I1004 02:55:50.316884 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5bxql" Oct 04 02:55:50 crc kubenswrapper[4964]: I1004 02:55:50.824607 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ee08-account-create-frb92"] Oct 04 02:55:50 crc kubenswrapper[4964]: E1004 02:55:50.824922 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2435773b-23eb-4a67-b454-4531b0c41831" containerName="mariadb-database-create" Oct 04 02:55:50 crc kubenswrapper[4964]: I1004 02:55:50.824935 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="2435773b-23eb-4a67-b454-4531b0c41831" containerName="mariadb-database-create" Oct 04 02:55:50 crc kubenswrapper[4964]: I1004 02:55:50.825107 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="2435773b-23eb-4a67-b454-4531b0c41831" containerName="mariadb-database-create" Oct 04 02:55:50 crc kubenswrapper[4964]: I1004 02:55:50.825590 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ee08-account-create-frb92" Oct 04 02:55:50 crc kubenswrapper[4964]: I1004 02:55:50.827393 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 04 02:55:50 crc kubenswrapper[4964]: I1004 02:55:50.834866 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ee08-account-create-frb92"] Oct 04 02:55:50 crc kubenswrapper[4964]: I1004 02:55:50.946759 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqhxq\" (UniqueName: \"kubernetes.io/projected/5035eec8-0929-4248-a007-d4dda1330e10-kube-api-access-dqhxq\") pod \"keystone-ee08-account-create-frb92\" (UID: \"5035eec8-0929-4248-a007-d4dda1330e10\") " pod="openstack/keystone-ee08-account-create-frb92" Oct 04 02:55:51 crc kubenswrapper[4964]: I1004 02:55:51.049096 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqhxq\" (UniqueName: \"kubernetes.io/projected/5035eec8-0929-4248-a007-d4dda1330e10-kube-api-access-dqhxq\") pod \"keystone-ee08-account-create-frb92\" (UID: \"5035eec8-0929-4248-a007-d4dda1330e10\") " pod="openstack/keystone-ee08-account-create-frb92" Oct 04 02:55:51 crc kubenswrapper[4964]: I1004 02:55:51.079951 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqhxq\" (UniqueName: \"kubernetes.io/projected/5035eec8-0929-4248-a007-d4dda1330e10-kube-api-access-dqhxq\") pod \"keystone-ee08-account-create-frb92\" (UID: \"5035eec8-0929-4248-a007-d4dda1330e10\") " pod="openstack/keystone-ee08-account-create-frb92" Oct 04 02:55:51 crc kubenswrapper[4964]: I1004 02:55:51.143868 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ee08-account-create-frb92" Oct 04 02:55:51 crc kubenswrapper[4964]: I1004 02:55:51.166931 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b6de-account-create-qzzbp"] Oct 04 02:55:51 crc kubenswrapper[4964]: I1004 02:55:51.168156 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b6de-account-create-qzzbp" Oct 04 02:55:51 crc kubenswrapper[4964]: I1004 02:55:51.171232 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 04 02:55:51 crc kubenswrapper[4964]: I1004 02:55:51.181396 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b6de-account-create-qzzbp"] Oct 04 02:55:51 crc kubenswrapper[4964]: I1004 02:55:51.252265 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2srfg\" (UniqueName: \"kubernetes.io/projected/eae45e80-cbad-42ac-a8a3-1141611d5f2d-kube-api-access-2srfg\") pod \"placement-b6de-account-create-qzzbp\" (UID: \"eae45e80-cbad-42ac-a8a3-1141611d5f2d\") " pod="openstack/placement-b6de-account-create-qzzbp" Oct 04 02:55:51 crc kubenswrapper[4964]: I1004 02:55:51.353067 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2srfg\" (UniqueName: \"kubernetes.io/projected/eae45e80-cbad-42ac-a8a3-1141611d5f2d-kube-api-access-2srfg\") pod \"placement-b6de-account-create-qzzbp\" (UID: \"eae45e80-cbad-42ac-a8a3-1141611d5f2d\") " pod="openstack/placement-b6de-account-create-qzzbp" Oct 04 02:55:51 crc kubenswrapper[4964]: I1004 02:55:51.376387 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2srfg\" (UniqueName: \"kubernetes.io/projected/eae45e80-cbad-42ac-a8a3-1141611d5f2d-kube-api-access-2srfg\") pod \"placement-b6de-account-create-qzzbp\" (UID: \"eae45e80-cbad-42ac-a8a3-1141611d5f2d\") " 
pod="openstack/placement-b6de-account-create-qzzbp" Oct 04 02:55:51 crc kubenswrapper[4964]: W1004 02:55:51.468541 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5035eec8_0929_4248_a007_d4dda1330e10.slice/crio-a89c70839885c812ff645a3e4e8aaeb69ddf27e63d0f6e21b79530a744fed05b WatchSource:0}: Error finding container a89c70839885c812ff645a3e4e8aaeb69ddf27e63d0f6e21b79530a744fed05b: Status 404 returned error can't find the container with id a89c70839885c812ff645a3e4e8aaeb69ddf27e63d0f6e21b79530a744fed05b Oct 04 02:55:51 crc kubenswrapper[4964]: I1004 02:55:51.473294 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ee08-account-create-frb92"] Oct 04 02:55:51 crc kubenswrapper[4964]: I1004 02:55:51.558470 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b6de-account-create-qzzbp" Oct 04 02:55:51 crc kubenswrapper[4964]: I1004 02:55:51.843329 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b6de-account-create-qzzbp"] Oct 04 02:55:51 crc kubenswrapper[4964]: W1004 02:55:51.846672 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeae45e80_cbad_42ac_a8a3_1141611d5f2d.slice/crio-5394b9632dbb1e05bddea0655a51151ade57a9ca36d8df4d0d4744ab6d914fb1 WatchSource:0}: Error finding container 5394b9632dbb1e05bddea0655a51151ade57a9ca36d8df4d0d4744ab6d914fb1: Status 404 returned error can't find the container with id 5394b9632dbb1e05bddea0655a51151ade57a9ca36d8df4d0d4744ab6d914fb1 Oct 04 02:55:52 crc kubenswrapper[4964]: I1004 02:55:52.342970 4964 generic.go:334] "Generic (PLEG): container finished" podID="5035eec8-0929-4248-a007-d4dda1330e10" containerID="ed6e40cfb81a9abf65b179922e645cd4b43fd20d669c83bc2a3a2ba42bc33617" exitCode=0 Oct 04 02:55:52 crc kubenswrapper[4964]: I1004 02:55:52.343063 4964 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ee08-account-create-frb92" event={"ID":"5035eec8-0929-4248-a007-d4dda1330e10","Type":"ContainerDied","Data":"ed6e40cfb81a9abf65b179922e645cd4b43fd20d669c83bc2a3a2ba42bc33617"} Oct 04 02:55:52 crc kubenswrapper[4964]: I1004 02:55:52.343103 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ee08-account-create-frb92" event={"ID":"5035eec8-0929-4248-a007-d4dda1330e10","Type":"ContainerStarted","Data":"a89c70839885c812ff645a3e4e8aaeb69ddf27e63d0f6e21b79530a744fed05b"} Oct 04 02:55:52 crc kubenswrapper[4964]: I1004 02:55:52.345718 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b6de-account-create-qzzbp" event={"ID":"eae45e80-cbad-42ac-a8a3-1141611d5f2d","Type":"ContainerStarted","Data":"5394b9632dbb1e05bddea0655a51151ade57a9ca36d8df4d0d4744ab6d914fb1"} Oct 04 02:55:53 crc kubenswrapper[4964]: I1004 02:55:53.060034 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-c6kng" podUID="b37ef67d-6614-4b44-9435-a35a4939caf7" containerName="ovn-controller" probeResult="failure" output=< Oct 04 02:55:53 crc kubenswrapper[4964]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 04 02:55:53 crc kubenswrapper[4964]: > Oct 04 02:55:53 crc kubenswrapper[4964]: I1004 02:55:53.073841 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:53 crc kubenswrapper[4964]: I1004 02:55:53.361340 4964 generic.go:334] "Generic (PLEG): container finished" podID="eae45e80-cbad-42ac-a8a3-1141611d5f2d" containerID="c76481151603b19eeb7778b8cf149fbbaa539806023499d9af1dfdb1118986d7" exitCode=0 Oct 04 02:55:53 crc kubenswrapper[4964]: I1004 02:55:53.361430 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b6de-account-create-qzzbp" 
event={"ID":"eae45e80-cbad-42ac-a8a3-1141611d5f2d","Type":"ContainerDied","Data":"c76481151603b19eeb7778b8cf149fbbaa539806023499d9af1dfdb1118986d7"} Oct 04 02:55:53 crc kubenswrapper[4964]: I1004 02:55:53.784345 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ee08-account-create-frb92" Oct 04 02:55:53 crc kubenswrapper[4964]: I1004 02:55:53.796501 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqhxq\" (UniqueName: \"kubernetes.io/projected/5035eec8-0929-4248-a007-d4dda1330e10-kube-api-access-dqhxq\") pod \"5035eec8-0929-4248-a007-d4dda1330e10\" (UID: \"5035eec8-0929-4248-a007-d4dda1330e10\") " Oct 04 02:55:53 crc kubenswrapper[4964]: I1004 02:55:53.838823 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5035eec8-0929-4248-a007-d4dda1330e10-kube-api-access-dqhxq" (OuterVolumeSpecName: "kube-api-access-dqhxq") pod "5035eec8-0929-4248-a007-d4dda1330e10" (UID: "5035eec8-0929-4248-a007-d4dda1330e10"). InnerVolumeSpecName "kube-api-access-dqhxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:55:53 crc kubenswrapper[4964]: I1004 02:55:53.897969 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqhxq\" (UniqueName: \"kubernetes.io/projected/5035eec8-0929-4248-a007-d4dda1330e10-kube-api-access-dqhxq\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:54 crc kubenswrapper[4964]: I1004 02:55:54.376771 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ee08-account-create-frb92" Oct 04 02:55:54 crc kubenswrapper[4964]: I1004 02:55:54.376801 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ee08-account-create-frb92" event={"ID":"5035eec8-0929-4248-a007-d4dda1330e10","Type":"ContainerDied","Data":"a89c70839885c812ff645a3e4e8aaeb69ddf27e63d0f6e21b79530a744fed05b"} Oct 04 02:55:54 crc kubenswrapper[4964]: I1004 02:55:54.377332 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a89c70839885c812ff645a3e4e8aaeb69ddf27e63d0f6e21b79530a744fed05b" Oct 04 02:55:54 crc kubenswrapper[4964]: I1004 02:55:54.801776 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b6de-account-create-qzzbp" Oct 04 02:55:54 crc kubenswrapper[4964]: I1004 02:55:54.813715 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2srfg\" (UniqueName: \"kubernetes.io/projected/eae45e80-cbad-42ac-a8a3-1141611d5f2d-kube-api-access-2srfg\") pod \"eae45e80-cbad-42ac-a8a3-1141611d5f2d\" (UID: \"eae45e80-cbad-42ac-a8a3-1141611d5f2d\") " Oct 04 02:55:54 crc kubenswrapper[4964]: I1004 02:55:54.826583 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eae45e80-cbad-42ac-a8a3-1141611d5f2d-kube-api-access-2srfg" (OuterVolumeSpecName: "kube-api-access-2srfg") pod "eae45e80-cbad-42ac-a8a3-1141611d5f2d" (UID: "eae45e80-cbad-42ac-a8a3-1141611d5f2d"). InnerVolumeSpecName "kube-api-access-2srfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:55:54 crc kubenswrapper[4964]: I1004 02:55:54.916975 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2srfg\" (UniqueName: \"kubernetes.io/projected/eae45e80-cbad-42ac-a8a3-1141611d5f2d-kube-api-access-2srfg\") on node \"crc\" DevicePath \"\"" Oct 04 02:55:55 crc kubenswrapper[4964]: I1004 02:55:55.390256 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b6de-account-create-qzzbp" event={"ID":"eae45e80-cbad-42ac-a8a3-1141611d5f2d","Type":"ContainerDied","Data":"5394b9632dbb1e05bddea0655a51151ade57a9ca36d8df4d0d4744ab6d914fb1"} Oct 04 02:55:55 crc kubenswrapper[4964]: I1004 02:55:55.391648 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5394b9632dbb1e05bddea0655a51151ade57a9ca36d8df4d0d4744ab6d914fb1" Oct 04 02:55:55 crc kubenswrapper[4964]: I1004 02:55:55.390352 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b6de-account-create-qzzbp" Oct 04 02:55:56 crc kubenswrapper[4964]: I1004 02:55:56.401426 4964 generic.go:334] "Generic (PLEG): container finished" podID="58ea849f-c48c-473c-8608-694d254c47cf" containerID="a520d0e8af5bf9aed07472b58ea6c9fdd00d5af00a084474f9d20db6d7bc2e60" exitCode=0 Oct 04 02:55:56 crc kubenswrapper[4964]: I1004 02:55:56.401492 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"58ea849f-c48c-473c-8608-694d254c47cf","Type":"ContainerDied","Data":"a520d0e8af5bf9aed07472b58ea6c9fdd00d5af00a084474f9d20db6d7bc2e60"} Oct 04 02:55:56 crc kubenswrapper[4964]: I1004 02:55:56.502323 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1840-account-create-hnvtp"] Oct 04 02:55:56 crc kubenswrapper[4964]: E1004 02:55:56.502999 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5035eec8-0929-4248-a007-d4dda1330e10" 
containerName="mariadb-account-create" Oct 04 02:55:56 crc kubenswrapper[4964]: I1004 02:55:56.503016 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="5035eec8-0929-4248-a007-d4dda1330e10" containerName="mariadb-account-create" Oct 04 02:55:56 crc kubenswrapper[4964]: E1004 02:55:56.503033 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae45e80-cbad-42ac-a8a3-1141611d5f2d" containerName="mariadb-account-create" Oct 04 02:55:56 crc kubenswrapper[4964]: I1004 02:55:56.503050 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae45e80-cbad-42ac-a8a3-1141611d5f2d" containerName="mariadb-account-create" Oct 04 02:55:56 crc kubenswrapper[4964]: I1004 02:55:56.503196 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae45e80-cbad-42ac-a8a3-1141611d5f2d" containerName="mariadb-account-create" Oct 04 02:55:56 crc kubenswrapper[4964]: I1004 02:55:56.503221 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="5035eec8-0929-4248-a007-d4dda1330e10" containerName="mariadb-account-create" Oct 04 02:55:56 crc kubenswrapper[4964]: I1004 02:55:56.503814 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1840-account-create-hnvtp" Oct 04 02:55:56 crc kubenswrapper[4964]: I1004 02:55:56.506727 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 04 02:55:56 crc kubenswrapper[4964]: I1004 02:55:56.518382 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1840-account-create-hnvtp"] Oct 04 02:55:56 crc kubenswrapper[4964]: I1004 02:55:56.549902 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b6tb\" (UniqueName: \"kubernetes.io/projected/46f231c3-73a1-4152-b618-ba3ac0e2b7f3-kube-api-access-9b6tb\") pod \"glance-1840-account-create-hnvtp\" (UID: \"46f231c3-73a1-4152-b618-ba3ac0e2b7f3\") " pod="openstack/glance-1840-account-create-hnvtp" Oct 04 02:55:56 crc kubenswrapper[4964]: I1004 02:55:56.651561 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b6tb\" (UniqueName: \"kubernetes.io/projected/46f231c3-73a1-4152-b618-ba3ac0e2b7f3-kube-api-access-9b6tb\") pod \"glance-1840-account-create-hnvtp\" (UID: \"46f231c3-73a1-4152-b618-ba3ac0e2b7f3\") " pod="openstack/glance-1840-account-create-hnvtp" Oct 04 02:55:56 crc kubenswrapper[4964]: I1004 02:55:56.673258 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b6tb\" (UniqueName: \"kubernetes.io/projected/46f231c3-73a1-4152-b618-ba3ac0e2b7f3-kube-api-access-9b6tb\") pod \"glance-1840-account-create-hnvtp\" (UID: \"46f231c3-73a1-4152-b618-ba3ac0e2b7f3\") " pod="openstack/glance-1840-account-create-hnvtp" Oct 04 02:55:56 crc kubenswrapper[4964]: I1004 02:55:56.929925 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1840-account-create-hnvtp" Oct 04 02:55:57 crc kubenswrapper[4964]: I1004 02:55:57.413257 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"58ea849f-c48c-473c-8608-694d254c47cf","Type":"ContainerStarted","Data":"6fe31342d7154218ae274b945e0de41c07dd0cdac895d00f76f419b97968eb9e"} Oct 04 02:55:57 crc kubenswrapper[4964]: I1004 02:55:57.413937 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:55:57 crc kubenswrapper[4964]: I1004 02:55:57.415196 4964 generic.go:334] "Generic (PLEG): container finished" podID="e87e52ad-66be-448b-b575-6d0acd8a8d4e" containerID="08db805bf8b88d305b35ca233a5a28aab7ae81c637374c238029a67c96628722" exitCode=0 Oct 04 02:55:57 crc kubenswrapper[4964]: I1004 02:55:57.415241 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e87e52ad-66be-448b-b575-6d0acd8a8d4e","Type":"ContainerDied","Data":"08db805bf8b88d305b35ca233a5a28aab7ae81c637374c238029a67c96628722"} Oct 04 02:55:57 crc kubenswrapper[4964]: I1004 02:55:57.487062 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=53.947525961 podStartE2EDuration="1m1.487039389s" podCreationTimestamp="2025-10-04 02:54:56 +0000 UTC" firstStartedPulling="2025-10-04 02:55:10.259280796 +0000 UTC m=+890.156239434" lastFinishedPulling="2025-10-04 02:55:17.798794184 +0000 UTC m=+897.695752862" observedRunningTime="2025-10-04 02:55:57.475786976 +0000 UTC m=+937.372745644" watchObservedRunningTime="2025-10-04 02:55:57.487039389 +0000 UTC m=+937.383998057" Oct 04 02:55:57 crc kubenswrapper[4964]: I1004 02:55:57.530179 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1840-account-create-hnvtp"] Oct 04 02:55:57 crc kubenswrapper[4964]: W1004 02:55:57.533689 4964 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46f231c3_73a1_4152_b618_ba3ac0e2b7f3.slice/crio-66be88845d9482dc746de9be3c7cd040cd0da4cb8b776202c5600f4d82613dec WatchSource:0}: Error finding container 66be88845d9482dc746de9be3c7cd040cd0da4cb8b776202c5600f4d82613dec: Status 404 returned error can't find the container with id 66be88845d9482dc746de9be3c7cd040cd0da4cb8b776202c5600f4d82613dec Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.043215 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-c6kng" podUID="b37ef67d-6614-4b44-9435-a35a4939caf7" containerName="ovn-controller" probeResult="failure" output=< Oct 04 02:55:58 crc kubenswrapper[4964]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 04 02:55:58 crc kubenswrapper[4964]: > Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.053267 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fj9vk" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.258678 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-c6kng-config-jv5pm"] Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.259627 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.261938 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.290169 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c6kng-config-jv5pm"] Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.380690 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-run\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.380829 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-run-ovn\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.380877 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a94ed339-8c96-46a9-be8b-7008b9c1385d-scripts\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.380921 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a94ed339-8c96-46a9-be8b-7008b9c1385d-additional-scripts\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: 
\"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.380956 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlp6t\" (UniqueName: \"kubernetes.io/projected/a94ed339-8c96-46a9-be8b-7008b9c1385d-kube-api-access-rlp6t\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.381064 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-log-ovn\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.423430 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e87e52ad-66be-448b-b575-6d0acd8a8d4e","Type":"ContainerStarted","Data":"7eba40b5529e0287412e42c271cdd31aae6d9a13102a9e1a22d19ff95912006c"} Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.424305 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.425963 4964 generic.go:334] "Generic (PLEG): container finished" podID="46f231c3-73a1-4152-b618-ba3ac0e2b7f3" containerID="ca8c25952e5ed21fcf9a509892af37cbd5efca74f8a26a776ef9957e78998dbe" exitCode=0 Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.426423 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1840-account-create-hnvtp" event={"ID":"46f231c3-73a1-4152-b618-ba3ac0e2b7f3","Type":"ContainerDied","Data":"ca8c25952e5ed21fcf9a509892af37cbd5efca74f8a26a776ef9957e78998dbe"} Oct 04 02:55:58 crc 
kubenswrapper[4964]: I1004 02:55:58.426460 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1840-account-create-hnvtp" event={"ID":"46f231c3-73a1-4152-b618-ba3ac0e2b7f3","Type":"ContainerStarted","Data":"66be88845d9482dc746de9be3c7cd040cd0da4cb8b776202c5600f4d82613dec"} Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.464884 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=55.922966803 podStartE2EDuration="1m2.464847652s" podCreationTimestamp="2025-10-04 02:54:56 +0000 UTC" firstStartedPulling="2025-10-04 02:55:10.286493553 +0000 UTC m=+890.183452191" lastFinishedPulling="2025-10-04 02:55:16.828374382 +0000 UTC m=+896.725333040" observedRunningTime="2025-10-04 02:55:58.446685073 +0000 UTC m=+938.343643731" watchObservedRunningTime="2025-10-04 02:55:58.464847652 +0000 UTC m=+938.361806280" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.482609 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-run-ovn\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.482678 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a94ed339-8c96-46a9-be8b-7008b9c1385d-scripts\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.482708 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a94ed339-8c96-46a9-be8b-7008b9c1385d-additional-scripts\") pod \"ovn-controller-c6kng-config-jv5pm\" 
(UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.482730 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlp6t\" (UniqueName: \"kubernetes.io/projected/a94ed339-8c96-46a9-be8b-7008b9c1385d-kube-api-access-rlp6t\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.482756 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-log-ovn\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.482818 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-run\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.482872 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-run-ovn\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.482929 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-run\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: 
\"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.482978 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-log-ovn\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.483557 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a94ed339-8c96-46a9-be8b-7008b9c1385d-additional-scripts\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.485114 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a94ed339-8c96-46a9-be8b-7008b9c1385d-scripts\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.505165 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlp6t\" (UniqueName: \"kubernetes.io/projected/a94ed339-8c96-46a9-be8b-7008b9c1385d-kube-api-access-rlp6t\") pod \"ovn-controller-c6kng-config-jv5pm\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:58 crc kubenswrapper[4964]: I1004 02:55:58.583294 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:55:59 crc kubenswrapper[4964]: W1004 02:55:59.072345 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda94ed339_8c96_46a9_be8b_7008b9c1385d.slice/crio-b3bd8604f52e1e7ef7bedd5761008d39b4d90c30b2158eda5565c7e4f70efdcd WatchSource:0}: Error finding container b3bd8604f52e1e7ef7bedd5761008d39b4d90c30b2158eda5565c7e4f70efdcd: Status 404 returned error can't find the container with id b3bd8604f52e1e7ef7bedd5761008d39b4d90c30b2158eda5565c7e4f70efdcd Oct 04 02:55:59 crc kubenswrapper[4964]: I1004 02:55:59.073400 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c6kng-config-jv5pm"] Oct 04 02:55:59 crc kubenswrapper[4964]: I1004 02:55:59.436937 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6kng-config-jv5pm" event={"ID":"a94ed339-8c96-46a9-be8b-7008b9c1385d","Type":"ContainerStarted","Data":"4de5a7142f94763e41aa722d6cd5f5f2dc190940355fb1556cf9e5ad259c9b29"} Oct 04 02:55:59 crc kubenswrapper[4964]: I1004 02:55:59.437424 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6kng-config-jv5pm" event={"ID":"a94ed339-8c96-46a9-be8b-7008b9c1385d","Type":"ContainerStarted","Data":"b3bd8604f52e1e7ef7bedd5761008d39b4d90c30b2158eda5565c7e4f70efdcd"} Oct 04 02:55:59 crc kubenswrapper[4964]: I1004 02:55:59.461594 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-c6kng-config-jv5pm" podStartSLOduration=1.461580895 podStartE2EDuration="1.461580895s" podCreationTimestamp="2025-10-04 02:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:55:59.45840878 +0000 UTC m=+939.355367418" watchObservedRunningTime="2025-10-04 02:55:59.461580895 +0000 UTC m=+939.358539533" 
Oct 04 02:55:59 crc kubenswrapper[4964]: I1004 02:55:59.755014 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1840-account-create-hnvtp" Oct 04 02:55:59 crc kubenswrapper[4964]: I1004 02:55:59.903754 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b6tb\" (UniqueName: \"kubernetes.io/projected/46f231c3-73a1-4152-b618-ba3ac0e2b7f3-kube-api-access-9b6tb\") pod \"46f231c3-73a1-4152-b618-ba3ac0e2b7f3\" (UID: \"46f231c3-73a1-4152-b618-ba3ac0e2b7f3\") " Oct 04 02:55:59 crc kubenswrapper[4964]: I1004 02:55:59.918520 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46f231c3-73a1-4152-b618-ba3ac0e2b7f3-kube-api-access-9b6tb" (OuterVolumeSpecName: "kube-api-access-9b6tb") pod "46f231c3-73a1-4152-b618-ba3ac0e2b7f3" (UID: "46f231c3-73a1-4152-b618-ba3ac0e2b7f3"). InnerVolumeSpecName "kube-api-access-9b6tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:56:00 crc kubenswrapper[4964]: I1004 02:56:00.006293 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b6tb\" (UniqueName: \"kubernetes.io/projected/46f231c3-73a1-4152-b618-ba3ac0e2b7f3-kube-api-access-9b6tb\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:00 crc kubenswrapper[4964]: I1004 02:56:00.450081 4964 generic.go:334] "Generic (PLEG): container finished" podID="a94ed339-8c96-46a9-be8b-7008b9c1385d" containerID="4de5a7142f94763e41aa722d6cd5f5f2dc190940355fb1556cf9e5ad259c9b29" exitCode=0 Oct 04 02:56:00 crc kubenswrapper[4964]: I1004 02:56:00.450187 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6kng-config-jv5pm" event={"ID":"a94ed339-8c96-46a9-be8b-7008b9c1385d","Type":"ContainerDied","Data":"4de5a7142f94763e41aa722d6cd5f5f2dc190940355fb1556cf9e5ad259c9b29"} Oct 04 02:56:00 crc kubenswrapper[4964]: I1004 02:56:00.454135 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-1840-account-create-hnvtp" event={"ID":"46f231c3-73a1-4152-b618-ba3ac0e2b7f3","Type":"ContainerDied","Data":"66be88845d9482dc746de9be3c7cd040cd0da4cb8b776202c5600f4d82613dec"} Oct 04 02:56:00 crc kubenswrapper[4964]: I1004 02:56:00.454199 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1840-account-create-hnvtp" Oct 04 02:56:00 crc kubenswrapper[4964]: I1004 02:56:00.454198 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66be88845d9482dc746de9be3c7cd040cd0da4cb8b776202c5600f4d82613dec" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.695999 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-vjrgk"] Oct 04 02:56:01 crc kubenswrapper[4964]: E1004 02:56:01.697953 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f231c3-73a1-4152-b618-ba3ac0e2b7f3" containerName="mariadb-account-create" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.697984 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f231c3-73a1-4152-b618-ba3ac0e2b7f3" containerName="mariadb-account-create" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.698161 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f231c3-73a1-4152-b618-ba3ac0e2b7f3" containerName="mariadb-account-create" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.698700 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.701046 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rkxfr" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.707344 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vjrgk"] Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.708764 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.835589 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-db-sync-config-data\") pod \"glance-db-sync-vjrgk\" (UID: \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") " pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.835692 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-config-data\") pod \"glance-db-sync-vjrgk\" (UID: \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") " pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.835740 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-combined-ca-bundle\") pod \"glance-db-sync-vjrgk\" (UID: \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") " pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.835773 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l5bh\" (UniqueName: 
\"kubernetes.io/projected/f1021ace-0fb9-45a8-b83b-12a487b37bf3-kube-api-access-7l5bh\") pod \"glance-db-sync-vjrgk\" (UID: \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") " pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.860429 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.936369 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a94ed339-8c96-46a9-be8b-7008b9c1385d-scripts\") pod \"a94ed339-8c96-46a9-be8b-7008b9c1385d\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.936408 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-run-ovn\") pod \"a94ed339-8c96-46a9-be8b-7008b9c1385d\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.936483 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-run\") pod \"a94ed339-8c96-46a9-be8b-7008b9c1385d\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.936510 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlp6t\" (UniqueName: \"kubernetes.io/projected/a94ed339-8c96-46a9-be8b-7008b9c1385d-kube-api-access-rlp6t\") pod \"a94ed339-8c96-46a9-be8b-7008b9c1385d\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.936557 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-log-ovn\") pod \"a94ed339-8c96-46a9-be8b-7008b9c1385d\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.936609 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a94ed339-8c96-46a9-be8b-7008b9c1385d-additional-scripts\") pod \"a94ed339-8c96-46a9-be8b-7008b9c1385d\" (UID: \"a94ed339-8c96-46a9-be8b-7008b9c1385d\") " Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.936822 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-config-data\") pod \"glance-db-sync-vjrgk\" (UID: \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") " pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.936869 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-combined-ca-bundle\") pod \"glance-db-sync-vjrgk\" (UID: \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") " pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.936907 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l5bh\" (UniqueName: \"kubernetes.io/projected/f1021ace-0fb9-45a8-b83b-12a487b37bf3-kube-api-access-7l5bh\") pod \"glance-db-sync-vjrgk\" (UID: \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") " pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.936939 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-db-sync-config-data\") pod \"glance-db-sync-vjrgk\" (UID: 
\"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") " pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.937847 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a94ed339-8c96-46a9-be8b-7008b9c1385d" (UID: "a94ed339-8c96-46a9-be8b-7008b9c1385d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.937883 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-run" (OuterVolumeSpecName: "var-run") pod "a94ed339-8c96-46a9-be8b-7008b9c1385d" (UID: "a94ed339-8c96-46a9-be8b-7008b9c1385d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.937918 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a94ed339-8c96-46a9-be8b-7008b9c1385d" (UID: "a94ed339-8c96-46a9-be8b-7008b9c1385d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.938726 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a94ed339-8c96-46a9-be8b-7008b9c1385d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a94ed339-8c96-46a9-be8b-7008b9c1385d" (UID: "a94ed339-8c96-46a9-be8b-7008b9c1385d"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.939027 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a94ed339-8c96-46a9-be8b-7008b9c1385d-scripts" (OuterVolumeSpecName: "scripts") pod "a94ed339-8c96-46a9-be8b-7008b9c1385d" (UID: "a94ed339-8c96-46a9-be8b-7008b9c1385d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.942354 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a94ed339-8c96-46a9-be8b-7008b9c1385d-kube-api-access-rlp6t" (OuterVolumeSpecName: "kube-api-access-rlp6t") pod "a94ed339-8c96-46a9-be8b-7008b9c1385d" (UID: "a94ed339-8c96-46a9-be8b-7008b9c1385d"). InnerVolumeSpecName "kube-api-access-rlp6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.942976 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-db-sync-config-data\") pod \"glance-db-sync-vjrgk\" (UID: \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") " pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.944944 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-combined-ca-bundle\") pod \"glance-db-sync-vjrgk\" (UID: \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") " pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.954999 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-config-data\") pod \"glance-db-sync-vjrgk\" (UID: \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") 
" pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:01 crc kubenswrapper[4964]: I1004 02:56:01.955599 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l5bh\" (UniqueName: \"kubernetes.io/projected/f1021ace-0fb9-45a8-b83b-12a487b37bf3-kube-api-access-7l5bh\") pod \"glance-db-sync-vjrgk\" (UID: \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") " pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.037902 4964 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a94ed339-8c96-46a9-be8b-7008b9c1385d-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.037936 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a94ed339-8c96-46a9-be8b-7008b9c1385d-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.037947 4964 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.037956 4964 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-run\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.037964 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlp6t\" (UniqueName: \"kubernetes.io/projected/a94ed339-8c96-46a9-be8b-7008b9c1385d-kube-api-access-rlp6t\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.037975 4964 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a94ed339-8c96-46a9-be8b-7008b9c1385d-var-log-ovn\") on node \"crc\" 
DevicePath \"\"" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.043882 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.474091 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6kng-config-jv5pm" event={"ID":"a94ed339-8c96-46a9-be8b-7008b9c1385d","Type":"ContainerDied","Data":"b3bd8604f52e1e7ef7bedd5761008d39b4d90c30b2158eda5565c7e4f70efdcd"} Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.474133 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3bd8604f52e1e7ef7bedd5761008d39b4d90c30b2158eda5565c7e4f70efdcd" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.474200 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c6kng-config-jv5pm" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.581992 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-c6kng-config-jv5pm"] Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.606538 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-c6kng-config-jv5pm"] Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.630195 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vjrgk"] Oct 04 02:56:02 crc kubenswrapper[4964]: W1004 02:56:02.648517 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1021ace_0fb9_45a8_b83b_12a487b37bf3.slice/crio-ab9f5ea98e3d12b6a4b91a44a1e4e0ff87930559f5fa92bf15cf836f4e6d8e82 WatchSource:0}: Error finding container ab9f5ea98e3d12b6a4b91a44a1e4e0ff87930559f5fa92bf15cf836f4e6d8e82: Status 404 returned error can't find the container with id ab9f5ea98e3d12b6a4b91a44a1e4e0ff87930559f5fa92bf15cf836f4e6d8e82 Oct 04 02:56:02 crc kubenswrapper[4964]: 
I1004 02:56:02.711721 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-c6kng-config-cfxt5"] Oct 04 02:56:02 crc kubenswrapper[4964]: E1004 02:56:02.712025 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94ed339-8c96-46a9-be8b-7008b9c1385d" containerName="ovn-config" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.712039 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94ed339-8c96-46a9-be8b-7008b9c1385d" containerName="ovn-config" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.712206 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="a94ed339-8c96-46a9-be8b-7008b9c1385d" containerName="ovn-config" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.712757 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.714863 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.722516 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c6kng-config-cfxt5"] Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.850850 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jscxf\" (UniqueName: \"kubernetes.io/projected/8de54ce8-8960-4c51-85dc-8b439c1ac25c-kube-api-access-jscxf\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.851081 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8de54ce8-8960-4c51-85dc-8b439c1ac25c-scripts\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: 
\"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.851179 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-run\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.851274 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-run-ovn\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.851348 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8de54ce8-8960-4c51-85dc-8b439c1ac25c-additional-scripts\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.851414 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-log-ovn\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.855029 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a94ed339-8c96-46a9-be8b-7008b9c1385d" path="/var/lib/kubelet/pods/a94ed339-8c96-46a9-be8b-7008b9c1385d/volumes" 
Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.952525 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8de54ce8-8960-4c51-85dc-8b439c1ac25c-additional-scripts\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.953604 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-log-ovn\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.954061 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jscxf\" (UniqueName: \"kubernetes.io/projected/8de54ce8-8960-4c51-85dc-8b439c1ac25c-kube-api-access-jscxf\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.954933 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8de54ce8-8960-4c51-85dc-8b439c1ac25c-scripts\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.955119 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-run\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 
crc kubenswrapper[4964]: I1004 02:56:02.955226 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-run-ovn\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.955419 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-run-ovn\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.953982 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-log-ovn\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.953556 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8de54ce8-8960-4c51-85dc-8b439c1ac25c-additional-scripts\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.956252 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-run\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.957522 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8de54ce8-8960-4c51-85dc-8b439c1ac25c-scripts\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:02 crc kubenswrapper[4964]: I1004 02:56:02.985180 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jscxf\" (UniqueName: \"kubernetes.io/projected/8de54ce8-8960-4c51-85dc-8b439c1ac25c-kube-api-access-jscxf\") pod \"ovn-controller-c6kng-config-cfxt5\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:03 crc kubenswrapper[4964]: I1004 02:56:03.027985 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:03 crc kubenswrapper[4964]: I1004 02:56:03.053794 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-c6kng" Oct 04 02:56:03 crc kubenswrapper[4964]: I1004 02:56:03.483160 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjrgk" event={"ID":"f1021ace-0fb9-45a8-b83b-12a487b37bf3","Type":"ContainerStarted","Data":"ab9f5ea98e3d12b6a4b91a44a1e4e0ff87930559f5fa92bf15cf836f4e6d8e82"} Oct 04 02:56:03 crc kubenswrapper[4964]: I1004 02:56:03.525029 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c6kng-config-cfxt5"] Oct 04 02:56:04 crc kubenswrapper[4964]: I1004 02:56:04.449293 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:56:04 crc kubenswrapper[4964]: I1004 02:56:04.451079 4964 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:56:04 crc kubenswrapper[4964]: I1004 02:56:04.497676 4964 generic.go:334] "Generic (PLEG): container finished" podID="8de54ce8-8960-4c51-85dc-8b439c1ac25c" containerID="e046e921c74c4d4ff280cc1bc9017c225b68c61f6c094fd319f351fcb4c8ccee" exitCode=0 Oct 04 02:56:04 crc kubenswrapper[4964]: I1004 02:56:04.497714 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6kng-config-cfxt5" event={"ID":"8de54ce8-8960-4c51-85dc-8b439c1ac25c","Type":"ContainerDied","Data":"e046e921c74c4d4ff280cc1bc9017c225b68c61f6c094fd319f351fcb4c8ccee"} Oct 04 02:56:04 crc kubenswrapper[4964]: I1004 02:56:04.497742 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6kng-config-cfxt5" event={"ID":"8de54ce8-8960-4c51-85dc-8b439c1ac25c","Type":"ContainerStarted","Data":"bb5a6930cd3332b9b5a310f2e7daee4d1958158bff7044fe14cca1c7b498f504"} Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.822148 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.908627 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-run-ovn\") pod \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.908684 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-log-ovn\") pod \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.908704 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-run\") pod \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.908760 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8de54ce8-8960-4c51-85dc-8b439c1ac25c" (UID: "8de54ce8-8960-4c51-85dc-8b439c1ac25c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.908788 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8de54ce8-8960-4c51-85dc-8b439c1ac25c" (UID: "8de54ce8-8960-4c51-85dc-8b439c1ac25c"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.908831 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8de54ce8-8960-4c51-85dc-8b439c1ac25c-scripts\") pod \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.908903 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jscxf\" (UniqueName: \"kubernetes.io/projected/8de54ce8-8960-4c51-85dc-8b439c1ac25c-kube-api-access-jscxf\") pod \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.908948 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8de54ce8-8960-4c51-85dc-8b439c1ac25c-additional-scripts\") pod \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\" (UID: \"8de54ce8-8960-4c51-85dc-8b439c1ac25c\") " Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.908937 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-run" (OuterVolumeSpecName: "var-run") pod "8de54ce8-8960-4c51-85dc-8b439c1ac25c" (UID: "8de54ce8-8960-4c51-85dc-8b439c1ac25c"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.909278 4964 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.909296 4964 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.909304 4964 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8de54ce8-8960-4c51-85dc-8b439c1ac25c-var-run\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.909961 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8de54ce8-8960-4c51-85dc-8b439c1ac25c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8de54ce8-8960-4c51-85dc-8b439c1ac25c" (UID: "8de54ce8-8960-4c51-85dc-8b439c1ac25c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.909983 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8de54ce8-8960-4c51-85dc-8b439c1ac25c-scripts" (OuterVolumeSpecName: "scripts") pod "8de54ce8-8960-4c51-85dc-8b439c1ac25c" (UID: "8de54ce8-8960-4c51-85dc-8b439c1ac25c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:05 crc kubenswrapper[4964]: I1004 02:56:05.918489 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de54ce8-8960-4c51-85dc-8b439c1ac25c-kube-api-access-jscxf" (OuterVolumeSpecName: "kube-api-access-jscxf") pod "8de54ce8-8960-4c51-85dc-8b439c1ac25c" (UID: "8de54ce8-8960-4c51-85dc-8b439c1ac25c"). InnerVolumeSpecName "kube-api-access-jscxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:56:06 crc kubenswrapper[4964]: I1004 02:56:06.010853 4964 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8de54ce8-8960-4c51-85dc-8b439c1ac25c-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:06 crc kubenswrapper[4964]: I1004 02:56:06.011239 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8de54ce8-8960-4c51-85dc-8b439c1ac25c-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:06 crc kubenswrapper[4964]: I1004 02:56:06.011253 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jscxf\" (UniqueName: \"kubernetes.io/projected/8de54ce8-8960-4c51-85dc-8b439c1ac25c-kube-api-access-jscxf\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:06 crc kubenswrapper[4964]: I1004 02:56:06.516847 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6kng-config-cfxt5" event={"ID":"8de54ce8-8960-4c51-85dc-8b439c1ac25c","Type":"ContainerDied","Data":"bb5a6930cd3332b9b5a310f2e7daee4d1958158bff7044fe14cca1c7b498f504"} Oct 04 02:56:06 crc kubenswrapper[4964]: I1004 02:56:06.516910 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb5a6930cd3332b9b5a310f2e7daee4d1958158bff7044fe14cca1c7b498f504" Oct 04 02:56:06 crc kubenswrapper[4964]: I1004 02:56:06.516924 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-c6kng-config-cfxt5" Oct 04 02:56:06 crc kubenswrapper[4964]: I1004 02:56:06.898060 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-c6kng-config-cfxt5"] Oct 04 02:56:06 crc kubenswrapper[4964]: I1004 02:56:06.904598 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-c6kng-config-cfxt5"] Oct 04 02:56:07 crc kubenswrapper[4964]: I1004 02:56:07.808969 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 04 02:56:08 crc kubenswrapper[4964]: I1004 02:56:08.856524 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de54ce8-8960-4c51-85dc-8b439c1ac25c" path="/var/lib/kubelet/pods/8de54ce8-8960-4c51-85dc-8b439c1ac25c/volumes" Oct 04 02:56:14 crc kubenswrapper[4964]: I1004 02:56:14.590483 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjrgk" event={"ID":"f1021ace-0fb9-45a8-b83b-12a487b37bf3","Type":"ContainerStarted","Data":"073961a4654faf47c49dbee30cc94672c0be75e3c5999128f54a47b58cf20d77"} Oct 04 02:56:14 crc kubenswrapper[4964]: I1004 02:56:14.617381 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-vjrgk" podStartSLOduration=2.528811342 podStartE2EDuration="13.617368673s" podCreationTimestamp="2025-10-04 02:56:01 +0000 UTC" firstStartedPulling="2025-10-04 02:56:02.652828157 +0000 UTC m=+942.549786795" lastFinishedPulling="2025-10-04 02:56:13.741385458 +0000 UTC m=+953.638344126" observedRunningTime="2025-10-04 02:56:14.614813005 +0000 UTC m=+954.511771643" watchObservedRunningTime="2025-10-04 02:56:14.617368673 +0000 UTC m=+954.514327311" Oct 04 02:56:17 crc kubenswrapper[4964]: I1004 02:56:17.528748 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 04 02:56:17 crc kubenswrapper[4964]: I1004 02:56:17.868219 
4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-c4xjv"] Oct 04 02:56:17 crc kubenswrapper[4964]: E1004 02:56:17.868858 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de54ce8-8960-4c51-85dc-8b439c1ac25c" containerName="ovn-config" Oct 04 02:56:17 crc kubenswrapper[4964]: I1004 02:56:17.868881 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de54ce8-8960-4c51-85dc-8b439c1ac25c" containerName="ovn-config" Oct 04 02:56:17 crc kubenswrapper[4964]: I1004 02:56:17.869085 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de54ce8-8960-4c51-85dc-8b439c1ac25c" containerName="ovn-config" Oct 04 02:56:17 crc kubenswrapper[4964]: I1004 02:56:17.869630 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-c4xjv" Oct 04 02:56:17 crc kubenswrapper[4964]: I1004 02:56:17.880208 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-c4xjv"] Oct 04 02:56:17 crc kubenswrapper[4964]: I1004 02:56:17.924472 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxqj8\" (UniqueName: \"kubernetes.io/projected/58c86e7f-453a-4cc5-a487-cd5ada7f25d2-kube-api-access-kxqj8\") pod \"barbican-db-create-c4xjv\" (UID: \"58c86e7f-453a-4cc5-a487-cd5ada7f25d2\") " pod="openstack/barbican-db-create-c4xjv" Oct 04 02:56:17 crc kubenswrapper[4964]: I1004 02:56:17.968971 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-ck4kh"] Oct 04 02:56:17 crc kubenswrapper[4964]: I1004 02:56:17.970169 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-ck4kh" Oct 04 02:56:17 crc kubenswrapper[4964]: I1004 02:56:17.977099 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ck4kh"] Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.026720 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxqj8\" (UniqueName: \"kubernetes.io/projected/58c86e7f-453a-4cc5-a487-cd5ada7f25d2-kube-api-access-kxqj8\") pod \"barbican-db-create-c4xjv\" (UID: \"58c86e7f-453a-4cc5-a487-cd5ada7f25d2\") " pod="openstack/barbican-db-create-c4xjv" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.026799 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbmpz\" (UniqueName: \"kubernetes.io/projected/1cc3e181-d921-4612-94f9-525ee8a91275-kube-api-access-pbmpz\") pod \"cinder-db-create-ck4kh\" (UID: \"1cc3e181-d921-4612-94f9-525ee8a91275\") " pod="openstack/cinder-db-create-ck4kh" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.056365 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxqj8\" (UniqueName: \"kubernetes.io/projected/58c86e7f-453a-4cc5-a487-cd5ada7f25d2-kube-api-access-kxqj8\") pod \"barbican-db-create-c4xjv\" (UID: \"58c86e7f-453a-4cc5-a487-cd5ada7f25d2\") " pod="openstack/barbican-db-create-c4xjv" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.128301 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbmpz\" (UniqueName: \"kubernetes.io/projected/1cc3e181-d921-4612-94f9-525ee8a91275-kube-api-access-pbmpz\") pod \"cinder-db-create-ck4kh\" (UID: \"1cc3e181-d921-4612-94f9-525ee8a91275\") " pod="openstack/cinder-db-create-ck4kh" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.146435 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbmpz\" (UniqueName: 
\"kubernetes.io/projected/1cc3e181-d921-4612-94f9-525ee8a91275-kube-api-access-pbmpz\") pod \"cinder-db-create-ck4kh\" (UID: \"1cc3e181-d921-4612-94f9-525ee8a91275\") " pod="openstack/cinder-db-create-ck4kh" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.184227 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-c4xjv" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.237373 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-45kld"] Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.238321 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-45kld" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.242448 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mslgh" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.242704 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.242927 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.243133 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.271502 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-45kld"] Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.283345 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ck4kh" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.284557 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-prrvx"] Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.285490 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-prrvx" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.325967 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-prrvx"] Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.338463 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qftt9\" (UniqueName: \"kubernetes.io/projected/34e16345-cf64-413c-a394-35c20d93aa02-kube-api-access-qftt9\") pod \"keystone-db-sync-45kld\" (UID: \"34e16345-cf64-413c-a394-35c20d93aa02\") " pod="openstack/keystone-db-sync-45kld" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.338782 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e16345-cf64-413c-a394-35c20d93aa02-combined-ca-bundle\") pod \"keystone-db-sync-45kld\" (UID: \"34e16345-cf64-413c-a394-35c20d93aa02\") " pod="openstack/keystone-db-sync-45kld" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.338871 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdtrq\" (UniqueName: \"kubernetes.io/projected/052d791b-de97-4d7c-b150-81e9fec1e0fc-kube-api-access-qdtrq\") pod \"neutron-db-create-prrvx\" (UID: \"052d791b-de97-4d7c-b150-81e9fec1e0fc\") " pod="openstack/neutron-db-create-prrvx" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.338993 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e16345-cf64-413c-a394-35c20d93aa02-config-data\") pod \"keystone-db-sync-45kld\" (UID: \"34e16345-cf64-413c-a394-35c20d93aa02\") " pod="openstack/keystone-db-sync-45kld" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.462980 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qftt9\" (UniqueName: \"kubernetes.io/projected/34e16345-cf64-413c-a394-35c20d93aa02-kube-api-access-qftt9\") pod \"keystone-db-sync-45kld\" (UID: \"34e16345-cf64-413c-a394-35c20d93aa02\") " pod="openstack/keystone-db-sync-45kld" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.463087 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e16345-cf64-413c-a394-35c20d93aa02-combined-ca-bundle\") pod \"keystone-db-sync-45kld\" (UID: \"34e16345-cf64-413c-a394-35c20d93aa02\") " pod="openstack/keystone-db-sync-45kld" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.463104 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdtrq\" (UniqueName: \"kubernetes.io/projected/052d791b-de97-4d7c-b150-81e9fec1e0fc-kube-api-access-qdtrq\") pod \"neutron-db-create-prrvx\" (UID: \"052d791b-de97-4d7c-b150-81e9fec1e0fc\") " pod="openstack/neutron-db-create-prrvx" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.463140 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e16345-cf64-413c-a394-35c20d93aa02-config-data\") pod \"keystone-db-sync-45kld\" (UID: \"34e16345-cf64-413c-a394-35c20d93aa02\") " pod="openstack/keystone-db-sync-45kld" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.470507 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e16345-cf64-413c-a394-35c20d93aa02-config-data\") pod \"keystone-db-sync-45kld\" (UID: \"34e16345-cf64-413c-a394-35c20d93aa02\") " pod="openstack/keystone-db-sync-45kld" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.471117 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/34e16345-cf64-413c-a394-35c20d93aa02-combined-ca-bundle\") pod \"keystone-db-sync-45kld\" (UID: \"34e16345-cf64-413c-a394-35c20d93aa02\") " pod="openstack/keystone-db-sync-45kld" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.479598 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qftt9\" (UniqueName: \"kubernetes.io/projected/34e16345-cf64-413c-a394-35c20d93aa02-kube-api-access-qftt9\") pod \"keystone-db-sync-45kld\" (UID: \"34e16345-cf64-413c-a394-35c20d93aa02\") " pod="openstack/keystone-db-sync-45kld" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.481474 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdtrq\" (UniqueName: \"kubernetes.io/projected/052d791b-de97-4d7c-b150-81e9fec1e0fc-kube-api-access-qdtrq\") pod \"neutron-db-create-prrvx\" (UID: \"052d791b-de97-4d7c-b150-81e9fec1e0fc\") " pod="openstack/neutron-db-create-prrvx" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.626567 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ck4kh"] Oct 04 02:56:18 crc kubenswrapper[4964]: W1004 02:56:18.637020 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cc3e181_d921_4612_94f9_525ee8a91275.slice/crio-f2c259cbaa6e68e8341cdc272568be878ac0f71c64cd712e9a69349ce3e240c4 WatchSource:0}: Error finding container f2c259cbaa6e68e8341cdc272568be878ac0f71c64cd712e9a69349ce3e240c4: Status 404 returned error can't find the container with id f2c259cbaa6e68e8341cdc272568be878ac0f71c64cd712e9a69349ce3e240c4 Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.664170 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-45kld" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.669306 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-prrvx" Oct 04 02:56:18 crc kubenswrapper[4964]: I1004 02:56:18.746208 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-c4xjv"] Oct 04 02:56:19 crc kubenswrapper[4964]: I1004 02:56:19.026474 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-prrvx"] Oct 04 02:56:19 crc kubenswrapper[4964]: W1004 02:56:19.086107 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod052d791b_de97_4d7c_b150_81e9fec1e0fc.slice/crio-0359371d8763fc7ed3964d2d4516833fa8448e658d4fb0cddb9514536c6d7301 WatchSource:0}: Error finding container 0359371d8763fc7ed3964d2d4516833fa8448e658d4fb0cddb9514536c6d7301: Status 404 returned error can't find the container with id 0359371d8763fc7ed3964d2d4516833fa8448e658d4fb0cddb9514536c6d7301 Oct 04 02:56:19 crc kubenswrapper[4964]: I1004 02:56:19.185078 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-45kld"] Oct 04 02:56:19 crc kubenswrapper[4964]: W1004 02:56:19.189961 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34e16345_cf64_413c_a394_35c20d93aa02.slice/crio-72836cfb9f35c5e0c16747fb148d10c61e951795ec5557ecccc0230784215252 WatchSource:0}: Error finding container 72836cfb9f35c5e0c16747fb148d10c61e951795ec5557ecccc0230784215252: Status 404 returned error can't find the container with id 72836cfb9f35c5e0c16747fb148d10c61e951795ec5557ecccc0230784215252 Oct 04 02:56:19 crc kubenswrapper[4964]: I1004 02:56:19.637358 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-45kld" event={"ID":"34e16345-cf64-413c-a394-35c20d93aa02","Type":"ContainerStarted","Data":"72836cfb9f35c5e0c16747fb148d10c61e951795ec5557ecccc0230784215252"} Oct 04 02:56:19 crc kubenswrapper[4964]: I1004 02:56:19.642459 
4964 generic.go:334] "Generic (PLEG): container finished" podID="58c86e7f-453a-4cc5-a487-cd5ada7f25d2" containerID="c50ac3544af7a9cfccb3d272e2dcae48a2ca1d92d4cff67360e26ca06ef705c1" exitCode=0 Oct 04 02:56:19 crc kubenswrapper[4964]: I1004 02:56:19.642546 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-c4xjv" event={"ID":"58c86e7f-453a-4cc5-a487-cd5ada7f25d2","Type":"ContainerDied","Data":"c50ac3544af7a9cfccb3d272e2dcae48a2ca1d92d4cff67360e26ca06ef705c1"} Oct 04 02:56:19 crc kubenswrapper[4964]: I1004 02:56:19.642578 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-c4xjv" event={"ID":"58c86e7f-453a-4cc5-a487-cd5ada7f25d2","Type":"ContainerStarted","Data":"854da1e716ef6e557bddce5743219afd1b27c63cf9678537543d72fa61b753dd"} Oct 04 02:56:19 crc kubenswrapper[4964]: I1004 02:56:19.645264 4964 generic.go:334] "Generic (PLEG): container finished" podID="1cc3e181-d921-4612-94f9-525ee8a91275" containerID="6c7cee255db4c04d0824d62a05ab8b73d451c8340899d7864ebaf68d71e70aab" exitCode=0 Oct 04 02:56:19 crc kubenswrapper[4964]: I1004 02:56:19.645362 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ck4kh" event={"ID":"1cc3e181-d921-4612-94f9-525ee8a91275","Type":"ContainerDied","Data":"6c7cee255db4c04d0824d62a05ab8b73d451c8340899d7864ebaf68d71e70aab"} Oct 04 02:56:19 crc kubenswrapper[4964]: I1004 02:56:19.645392 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ck4kh" event={"ID":"1cc3e181-d921-4612-94f9-525ee8a91275","Type":"ContainerStarted","Data":"f2c259cbaa6e68e8341cdc272568be878ac0f71c64cd712e9a69349ce3e240c4"} Oct 04 02:56:19 crc kubenswrapper[4964]: I1004 02:56:19.648320 4964 generic.go:334] "Generic (PLEG): container finished" podID="052d791b-de97-4d7c-b150-81e9fec1e0fc" containerID="776d04b224e16a7471121aba66de0fe29e36e9d403433e30608db87af4ea8708" exitCode=0 Oct 04 02:56:19 crc kubenswrapper[4964]: I1004 
02:56:19.648378 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-prrvx" event={"ID":"052d791b-de97-4d7c-b150-81e9fec1e0fc","Type":"ContainerDied","Data":"776d04b224e16a7471121aba66de0fe29e36e9d403433e30608db87af4ea8708"} Oct 04 02:56:19 crc kubenswrapper[4964]: I1004 02:56:19.648415 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-prrvx" event={"ID":"052d791b-de97-4d7c-b150-81e9fec1e0fc","Type":"ContainerStarted","Data":"0359371d8763fc7ed3964d2d4516833fa8448e658d4fb0cddb9514536c6d7301"} Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.018171 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ck4kh" Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.100112 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbmpz\" (UniqueName: \"kubernetes.io/projected/1cc3e181-d921-4612-94f9-525ee8a91275-kube-api-access-pbmpz\") pod \"1cc3e181-d921-4612-94f9-525ee8a91275\" (UID: \"1cc3e181-d921-4612-94f9-525ee8a91275\") " Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.107927 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cc3e181-d921-4612-94f9-525ee8a91275-kube-api-access-pbmpz" (OuterVolumeSpecName: "kube-api-access-pbmpz") pod "1cc3e181-d921-4612-94f9-525ee8a91275" (UID: "1cc3e181-d921-4612-94f9-525ee8a91275"). InnerVolumeSpecName "kube-api-access-pbmpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.143905 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-prrvx" Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.149157 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-c4xjv" Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.201312 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxqj8\" (UniqueName: \"kubernetes.io/projected/58c86e7f-453a-4cc5-a487-cd5ada7f25d2-kube-api-access-kxqj8\") pod \"58c86e7f-453a-4cc5-a487-cd5ada7f25d2\" (UID: \"58c86e7f-453a-4cc5-a487-cd5ada7f25d2\") " Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.201348 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdtrq\" (UniqueName: \"kubernetes.io/projected/052d791b-de97-4d7c-b150-81e9fec1e0fc-kube-api-access-qdtrq\") pod \"052d791b-de97-4d7c-b150-81e9fec1e0fc\" (UID: \"052d791b-de97-4d7c-b150-81e9fec1e0fc\") " Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.201700 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbmpz\" (UniqueName: \"kubernetes.io/projected/1cc3e181-d921-4612-94f9-525ee8a91275-kube-api-access-pbmpz\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.209115 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/052d791b-de97-4d7c-b150-81e9fec1e0fc-kube-api-access-qdtrq" (OuterVolumeSpecName: "kube-api-access-qdtrq") pod "052d791b-de97-4d7c-b150-81e9fec1e0fc" (UID: "052d791b-de97-4d7c-b150-81e9fec1e0fc"). InnerVolumeSpecName "kube-api-access-qdtrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.210169 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c86e7f-453a-4cc5-a487-cd5ada7f25d2-kube-api-access-kxqj8" (OuterVolumeSpecName: "kube-api-access-kxqj8") pod "58c86e7f-453a-4cc5-a487-cd5ada7f25d2" (UID: "58c86e7f-453a-4cc5-a487-cd5ada7f25d2"). InnerVolumeSpecName "kube-api-access-kxqj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.303158 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxqj8\" (UniqueName: \"kubernetes.io/projected/58c86e7f-453a-4cc5-a487-cd5ada7f25d2-kube-api-access-kxqj8\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.303204 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdtrq\" (UniqueName: \"kubernetes.io/projected/052d791b-de97-4d7c-b150-81e9fec1e0fc-kube-api-access-qdtrq\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.664005 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-prrvx" Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.664027 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-prrvx" event={"ID":"052d791b-de97-4d7c-b150-81e9fec1e0fc","Type":"ContainerDied","Data":"0359371d8763fc7ed3964d2d4516833fa8448e658d4fb0cddb9514536c6d7301"} Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.664064 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0359371d8763fc7ed3964d2d4516833fa8448e658d4fb0cddb9514536c6d7301" Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.666898 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-c4xjv" event={"ID":"58c86e7f-453a-4cc5-a487-cd5ada7f25d2","Type":"ContainerDied","Data":"854da1e716ef6e557bddce5743219afd1b27c63cf9678537543d72fa61b753dd"} Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.666931 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="854da1e716ef6e557bddce5743219afd1b27c63cf9678537543d72fa61b753dd" Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.666954 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-c4xjv" Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.670762 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ck4kh" event={"ID":"1cc3e181-d921-4612-94f9-525ee8a91275","Type":"ContainerDied","Data":"f2c259cbaa6e68e8341cdc272568be878ac0f71c64cd712e9a69349ce3e240c4"} Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.670787 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2c259cbaa6e68e8341cdc272568be878ac0f71c64cd712e9a69349ce3e240c4" Oct 04 02:56:21 crc kubenswrapper[4964]: I1004 02:56:21.670819 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ck4kh" Oct 04 02:56:22 crc kubenswrapper[4964]: I1004 02:56:22.680878 4964 generic.go:334] "Generic (PLEG): container finished" podID="f1021ace-0fb9-45a8-b83b-12a487b37bf3" containerID="073961a4654faf47c49dbee30cc94672c0be75e3c5999128f54a47b58cf20d77" exitCode=0 Oct 04 02:56:22 crc kubenswrapper[4964]: I1004 02:56:22.680978 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjrgk" event={"ID":"f1021ace-0fb9-45a8-b83b-12a487b37bf3","Type":"ContainerDied","Data":"073961a4654faf47c49dbee30cc94672c0be75e3c5999128f54a47b58cf20d77"} Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.190888 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.364237 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-config-data\") pod \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\" (UID: \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") " Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.364776 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-db-sync-config-data\") pod \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\" (UID: \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") " Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.365807 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-combined-ca-bundle\") pod \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\" (UID: \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") " Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.365883 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l5bh\" (UniqueName: \"kubernetes.io/projected/f1021ace-0fb9-45a8-b83b-12a487b37bf3-kube-api-access-7l5bh\") pod \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\" (UID: \"f1021ace-0fb9-45a8-b83b-12a487b37bf3\") " Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.370987 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1021ace-0fb9-45a8-b83b-12a487b37bf3-kube-api-access-7l5bh" (OuterVolumeSpecName: "kube-api-access-7l5bh") pod "f1021ace-0fb9-45a8-b83b-12a487b37bf3" (UID: "f1021ace-0fb9-45a8-b83b-12a487b37bf3"). InnerVolumeSpecName "kube-api-access-7l5bh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.372068 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f1021ace-0fb9-45a8-b83b-12a487b37bf3" (UID: "f1021ace-0fb9-45a8-b83b-12a487b37bf3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.408154 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1021ace-0fb9-45a8-b83b-12a487b37bf3" (UID: "f1021ace-0fb9-45a8-b83b-12a487b37bf3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.410813 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-config-data" (OuterVolumeSpecName: "config-data") pod "f1021ace-0fb9-45a8-b83b-12a487b37bf3" (UID: "f1021ace-0fb9-45a8-b83b-12a487b37bf3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.467574 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7l5bh\" (UniqueName: \"kubernetes.io/projected/f1021ace-0fb9-45a8-b83b-12a487b37bf3-kube-api-access-7l5bh\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.467686 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.467716 4964 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.467743 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1021ace-0fb9-45a8-b83b-12a487b37bf3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.704839 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjrgk" event={"ID":"f1021ace-0fb9-45a8-b83b-12a487b37bf3","Type":"ContainerDied","Data":"ab9f5ea98e3d12b6a4b91a44a1e4e0ff87930559f5fa92bf15cf836f4e6d8e82"} Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.704881 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab9f5ea98e3d12b6a4b91a44a1e4e0ff87930559f5fa92bf15cf836f4e6d8e82" Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.704912 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vjrgk" Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.707780 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-45kld" event={"ID":"34e16345-cf64-413c-a394-35c20d93aa02","Type":"ContainerStarted","Data":"15598fc34d35e2a401f8c568b75aa8bb19da3754d84279443a3b1b103d206005"} Oct 04 02:56:24 crc kubenswrapper[4964]: I1004 02:56:24.755357 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-45kld" podStartSLOduration=1.76116225 podStartE2EDuration="6.755336514s" podCreationTimestamp="2025-10-04 02:56:18 +0000 UTC" firstStartedPulling="2025-10-04 02:56:19.19255226 +0000 UTC m=+959.089510898" lastFinishedPulling="2025-10-04 02:56:24.186726524 +0000 UTC m=+964.083685162" observedRunningTime="2025-10-04 02:56:24.742218531 +0000 UTC m=+964.639177169" watchObservedRunningTime="2025-10-04 02:56:24.755336514 +0000 UTC m=+964.652295152" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.178696 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-sxcq5"] Oct 04 02:56:25 crc kubenswrapper[4964]: E1004 02:56:25.179474 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="052d791b-de97-4d7c-b150-81e9fec1e0fc" containerName="mariadb-database-create" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.179500 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="052d791b-de97-4d7c-b150-81e9fec1e0fc" containerName="mariadb-database-create" Oct 04 02:56:25 crc kubenswrapper[4964]: E1004 02:56:25.179522 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc3e181-d921-4612-94f9-525ee8a91275" containerName="mariadb-database-create" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.179528 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc3e181-d921-4612-94f9-525ee8a91275" containerName="mariadb-database-create" Oct 04 02:56:25 crc 
kubenswrapper[4964]: E1004 02:56:25.179555 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1021ace-0fb9-45a8-b83b-12a487b37bf3" containerName="glance-db-sync" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.179562 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1021ace-0fb9-45a8-b83b-12a487b37bf3" containerName="glance-db-sync" Oct 04 02:56:25 crc kubenswrapper[4964]: E1004 02:56:25.179575 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c86e7f-453a-4cc5-a487-cd5ada7f25d2" containerName="mariadb-database-create" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.179582 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c86e7f-453a-4cc5-a487-cd5ada7f25d2" containerName="mariadb-database-create" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.180087 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="052d791b-de97-4d7c-b150-81e9fec1e0fc" containerName="mariadb-database-create" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.180118 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1021ace-0fb9-45a8-b83b-12a487b37bf3" containerName="glance-db-sync" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.180129 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="58c86e7f-453a-4cc5-a487-cd5ada7f25d2" containerName="mariadb-database-create" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.180149 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc3e181-d921-4612-94f9-525ee8a91275" containerName="mariadb-database-create" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.181467 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.217068 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-sxcq5"] Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.286436 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-sxcq5\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.286595 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzm87\" (UniqueName: \"kubernetes.io/projected/868df0b1-4478-408f-9f71-b829873b6ca9-kube-api-access-bzm87\") pod \"dnsmasq-dns-54f9b7b8d9-sxcq5\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.286839 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-config\") pod \"dnsmasq-dns-54f9b7b8d9-sxcq5\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.287290 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-sxcq5\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.287360 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-sxcq5\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.388813 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-sxcq5\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.389296 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzm87\" (UniqueName: \"kubernetes.io/projected/868df0b1-4478-408f-9f71-b829873b6ca9-kube-api-access-bzm87\") pod \"dnsmasq-dns-54f9b7b8d9-sxcq5\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.389457 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-config\") pod \"dnsmasq-dns-54f9b7b8d9-sxcq5\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.389602 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-sxcq5\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.389722 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-sxcq5\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.389821 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-sxcq5\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.390635 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-sxcq5\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.390651 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-sxcq5\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.390943 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-config\") pod \"dnsmasq-dns-54f9b7b8d9-sxcq5\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.409197 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzm87\" (UniqueName: \"kubernetes.io/projected/868df0b1-4478-408f-9f71-b829873b6ca9-kube-api-access-bzm87\") pod 
\"dnsmasq-dns-54f9b7b8d9-sxcq5\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.504829 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:25 crc kubenswrapper[4964]: I1004 02:56:25.987566 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-sxcq5"] Oct 04 02:56:26 crc kubenswrapper[4964]: I1004 02:56:26.727770 4964 generic.go:334] "Generic (PLEG): container finished" podID="868df0b1-4478-408f-9f71-b829873b6ca9" containerID="8378bfa5c5e367ce84b59512ca36d46a434bfbbf024d508c4e8f2ced165bdc53" exitCode=0 Oct 04 02:56:26 crc kubenswrapper[4964]: I1004 02:56:26.727840 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" event={"ID":"868df0b1-4478-408f-9f71-b829873b6ca9","Type":"ContainerDied","Data":"8378bfa5c5e367ce84b59512ca36d46a434bfbbf024d508c4e8f2ced165bdc53"} Oct 04 02:56:26 crc kubenswrapper[4964]: I1004 02:56:26.728131 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" event={"ID":"868df0b1-4478-408f-9f71-b829873b6ca9","Type":"ContainerStarted","Data":"ae3bbf99e3639c8123e9792930d9829a7c26268674784a2894e98ee701ae4699"} Oct 04 02:56:27 crc kubenswrapper[4964]: I1004 02:56:27.740745 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" event={"ID":"868df0b1-4478-408f-9f71-b829873b6ca9","Type":"ContainerStarted","Data":"540409652c96b13755fa9c7de3fd553c9e0042f8c0e4b312edea646b6a031bfe"} Oct 04 02:56:27 crc kubenswrapper[4964]: I1004 02:56:27.741172 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:27 crc kubenswrapper[4964]: I1004 02:56:27.743105 4964 generic.go:334] "Generic (PLEG): container finished" 
podID="34e16345-cf64-413c-a394-35c20d93aa02" containerID="15598fc34d35e2a401f8c568b75aa8bb19da3754d84279443a3b1b103d206005" exitCode=0 Oct 04 02:56:27 crc kubenswrapper[4964]: I1004 02:56:27.743174 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-45kld" event={"ID":"34e16345-cf64-413c-a394-35c20d93aa02","Type":"ContainerDied","Data":"15598fc34d35e2a401f8c568b75aa8bb19da3754d84279443a3b1b103d206005"} Oct 04 02:56:27 crc kubenswrapper[4964]: I1004 02:56:27.777773 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" podStartSLOduration=2.777207832 podStartE2EDuration="2.777207832s" podCreationTimestamp="2025-10-04 02:56:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:56:27.764688915 +0000 UTC m=+967.661647593" watchObservedRunningTime="2025-10-04 02:56:27.777207832 +0000 UTC m=+967.674166510" Oct 04 02:56:27 crc kubenswrapper[4964]: I1004 02:56:27.961984 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4f8a-account-create-flfbn"] Oct 04 02:56:27 crc kubenswrapper[4964]: I1004 02:56:27.963456 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4f8a-account-create-flfbn" Oct 04 02:56:27 crc kubenswrapper[4964]: I1004 02:56:27.967814 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 04 02:56:27 crc kubenswrapper[4964]: I1004 02:56:27.986577 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4f8a-account-create-flfbn"] Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.060442 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6c4d-account-create-rv5nz"] Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.061894 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6c4d-account-create-rv5nz" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.064887 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.072422 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6c4d-account-create-rv5nz"] Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.133233 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb6mr\" (UniqueName: \"kubernetes.io/projected/26dddf74-bd10-4534-9c12-8fd8c9311475-kube-api-access-xb6mr\") pod \"cinder-4f8a-account-create-flfbn\" (UID: \"26dddf74-bd10-4534-9c12-8fd8c9311475\") " pod="openstack/cinder-4f8a-account-create-flfbn" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.235407 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94c2g\" (UniqueName: \"kubernetes.io/projected/e2fdf522-a1fc-4ce6-a819-533ac61f5a5d-kube-api-access-94c2g\") pod \"barbican-6c4d-account-create-rv5nz\" (UID: \"e2fdf522-a1fc-4ce6-a819-533ac61f5a5d\") " pod="openstack/barbican-6c4d-account-create-rv5nz" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.235585 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb6mr\" (UniqueName: \"kubernetes.io/projected/26dddf74-bd10-4534-9c12-8fd8c9311475-kube-api-access-xb6mr\") pod \"cinder-4f8a-account-create-flfbn\" (UID: \"26dddf74-bd10-4534-9c12-8fd8c9311475\") " pod="openstack/cinder-4f8a-account-create-flfbn" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.292413 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0b5c-account-create-nwmgs"] Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.293289 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0b5c-account-create-nwmgs" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.293740 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb6mr\" (UniqueName: \"kubernetes.io/projected/26dddf74-bd10-4534-9c12-8fd8c9311475-kube-api-access-xb6mr\") pod \"cinder-4f8a-account-create-flfbn\" (UID: \"26dddf74-bd10-4534-9c12-8fd8c9311475\") " pod="openstack/cinder-4f8a-account-create-flfbn" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.296800 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.307508 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4f8a-account-create-flfbn" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.338070 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94c2g\" (UniqueName: \"kubernetes.io/projected/e2fdf522-a1fc-4ce6-a819-533ac61f5a5d-kube-api-access-94c2g\") pod \"barbican-6c4d-account-create-rv5nz\" (UID: \"e2fdf522-a1fc-4ce6-a819-533ac61f5a5d\") " pod="openstack/barbican-6c4d-account-create-rv5nz" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.387493 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0b5c-account-create-nwmgs"] Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.418474 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94c2g\" (UniqueName: \"kubernetes.io/projected/e2fdf522-a1fc-4ce6-a819-533ac61f5a5d-kube-api-access-94c2g\") pod \"barbican-6c4d-account-create-rv5nz\" (UID: \"e2fdf522-a1fc-4ce6-a819-533ac61f5a5d\") " pod="openstack/barbican-6c4d-account-create-rv5nz" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.440391 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-975kc\" (UniqueName: \"kubernetes.io/projected/81b5818e-cfbf-49bc-995f-b7024d46b020-kube-api-access-975kc\") pod \"neutron-0b5c-account-create-nwmgs\" (UID: \"81b5818e-cfbf-49bc-995f-b7024d46b020\") " pod="openstack/neutron-0b5c-account-create-nwmgs" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.542381 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-975kc\" (UniqueName: \"kubernetes.io/projected/81b5818e-cfbf-49bc-995f-b7024d46b020-kube-api-access-975kc\") pod \"neutron-0b5c-account-create-nwmgs\" (UID: \"81b5818e-cfbf-49bc-995f-b7024d46b020\") " pod="openstack/neutron-0b5c-account-create-nwmgs" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.558731 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-975kc\" (UniqueName: \"kubernetes.io/projected/81b5818e-cfbf-49bc-995f-b7024d46b020-kube-api-access-975kc\") pod \"neutron-0b5c-account-create-nwmgs\" (UID: \"81b5818e-cfbf-49bc-995f-b7024d46b020\") " pod="openstack/neutron-0b5c-account-create-nwmgs" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.678268 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6c4d-account-create-rv5nz" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.751112 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0b5c-account-create-nwmgs" Oct 04 02:56:28 crc kubenswrapper[4964]: I1004 02:56:28.878713 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4f8a-account-create-flfbn"] Oct 04 02:56:28 crc kubenswrapper[4964]: W1004 02:56:28.887113 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26dddf74_bd10_4534_9c12_8fd8c9311475.slice/crio-d50549c39cdf2b9a67d3be712e517bd7b8d7864218c34c20f8c65fb9aa85036a WatchSource:0}: Error finding container d50549c39cdf2b9a67d3be712e517bd7b8d7864218c34c20f8c65fb9aa85036a: Status 404 returned error can't find the container with id d50549c39cdf2b9a67d3be712e517bd7b8d7864218c34c20f8c65fb9aa85036a Oct 04 02:56:29 crc kubenswrapper[4964]: I1004 02:56:29.209917 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6c4d-account-create-rv5nz"] Oct 04 02:56:29 crc kubenswrapper[4964]: I1004 02:56:29.271281 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0b5c-account-create-nwmgs"] Oct 04 02:56:29 crc kubenswrapper[4964]: W1004 02:56:29.276009 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81b5818e_cfbf_49bc_995f_b7024d46b020.slice/crio-a2ba8bb1c9c3e8bfd05fee5ae7f50380fa2d1acabace230a59efdbddbeca6140 WatchSource:0}: Error finding container a2ba8bb1c9c3e8bfd05fee5ae7f50380fa2d1acabace230a59efdbddbeca6140: Status 404 returned error can't find the container with id a2ba8bb1c9c3e8bfd05fee5ae7f50380fa2d1acabace230a59efdbddbeca6140 Oct 04 02:56:29 crc kubenswrapper[4964]: I1004 02:56:29.761172 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6c4d-account-create-rv5nz" event={"ID":"e2fdf522-a1fc-4ce6-a819-533ac61f5a5d","Type":"ContainerStarted","Data":"2133608917d485ffa36e64778d7c1b0048a81fd20cf628dfa81259c5d58b717a"} Oct 04 
02:56:29 crc kubenswrapper[4964]: I1004 02:56:29.763956 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0b5c-account-create-nwmgs" event={"ID":"81b5818e-cfbf-49bc-995f-b7024d46b020","Type":"ContainerStarted","Data":"a2ba8bb1c9c3e8bfd05fee5ae7f50380fa2d1acabace230a59efdbddbeca6140"} Oct 04 02:56:29 crc kubenswrapper[4964]: I1004 02:56:29.765576 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4f8a-account-create-flfbn" event={"ID":"26dddf74-bd10-4534-9c12-8fd8c9311475","Type":"ContainerStarted","Data":"d50549c39cdf2b9a67d3be712e517bd7b8d7864218c34c20f8c65fb9aa85036a"} Oct 04 02:56:30 crc kubenswrapper[4964]: I1004 02:56:30.048331 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-45kld" Oct 04 02:56:30 crc kubenswrapper[4964]: I1004 02:56:30.174561 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e16345-cf64-413c-a394-35c20d93aa02-combined-ca-bundle\") pod \"34e16345-cf64-413c-a394-35c20d93aa02\" (UID: \"34e16345-cf64-413c-a394-35c20d93aa02\") " Oct 04 02:56:30 crc kubenswrapper[4964]: I1004 02:56:30.174751 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qftt9\" (UniqueName: \"kubernetes.io/projected/34e16345-cf64-413c-a394-35c20d93aa02-kube-api-access-qftt9\") pod \"34e16345-cf64-413c-a394-35c20d93aa02\" (UID: \"34e16345-cf64-413c-a394-35c20d93aa02\") " Oct 04 02:56:30 crc kubenswrapper[4964]: I1004 02:56:30.174810 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e16345-cf64-413c-a394-35c20d93aa02-config-data\") pod \"34e16345-cf64-413c-a394-35c20d93aa02\" (UID: \"34e16345-cf64-413c-a394-35c20d93aa02\") " Oct 04 02:56:30 crc kubenswrapper[4964]: I1004 02:56:30.182761 4964 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e16345-cf64-413c-a394-35c20d93aa02-kube-api-access-qftt9" (OuterVolumeSpecName: "kube-api-access-qftt9") pod "34e16345-cf64-413c-a394-35c20d93aa02" (UID: "34e16345-cf64-413c-a394-35c20d93aa02"). InnerVolumeSpecName "kube-api-access-qftt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:56:30 crc kubenswrapper[4964]: I1004 02:56:30.218922 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e16345-cf64-413c-a394-35c20d93aa02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34e16345-cf64-413c-a394-35c20d93aa02" (UID: "34e16345-cf64-413c-a394-35c20d93aa02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:56:30 crc kubenswrapper[4964]: I1004 02:56:30.239523 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e16345-cf64-413c-a394-35c20d93aa02-config-data" (OuterVolumeSpecName: "config-data") pod "34e16345-cf64-413c-a394-35c20d93aa02" (UID: "34e16345-cf64-413c-a394-35c20d93aa02"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:56:30 crc kubenswrapper[4964]: I1004 02:56:30.277472 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e16345-cf64-413c-a394-35c20d93aa02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:30 crc kubenswrapper[4964]: I1004 02:56:30.277520 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qftt9\" (UniqueName: \"kubernetes.io/projected/34e16345-cf64-413c-a394-35c20d93aa02-kube-api-access-qftt9\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:30 crc kubenswrapper[4964]: I1004 02:56:30.277533 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e16345-cf64-413c-a394-35c20d93aa02-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:30 crc kubenswrapper[4964]: I1004 02:56:30.784306 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-45kld" event={"ID":"34e16345-cf64-413c-a394-35c20d93aa02","Type":"ContainerDied","Data":"72836cfb9f35c5e0c16747fb148d10c61e951795ec5557ecccc0230784215252"} Oct 04 02:56:30 crc kubenswrapper[4964]: I1004 02:56:30.784367 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72836cfb9f35c5e0c16747fb148d10c61e951795ec5557ecccc0230784215252" Oct 04 02:56:30 crc kubenswrapper[4964]: I1004 02:56:30.784454 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-45kld" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.340294 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-sxcq5"] Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.343907 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" podUID="868df0b1-4478-408f-9f71-b829873b6ca9" containerName="dnsmasq-dns" containerID="cri-o://540409652c96b13755fa9c7de3fd553c9e0042f8c0e4b312edea646b6a031bfe" gracePeriod=10 Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.355088 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.386013 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-plfh6"] Oct 04 02:56:31 crc kubenswrapper[4964]: E1004 02:56:31.386337 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e16345-cf64-413c-a394-35c20d93aa02" containerName="keystone-db-sync" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.386352 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e16345-cf64-413c-a394-35c20d93aa02" containerName="keystone-db-sync" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.386520 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="34e16345-cf64-413c-a394-35c20d93aa02" containerName="keystone-db-sync" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.387098 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.390046 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.390183 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mslgh" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.390288 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.390398 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.395334 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-lmjwf"] Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.396665 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.426863 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-plfh6"] Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.469587 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-lmjwf"] Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.502312 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-scripts\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.502357 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-config\") pod \"dnsmasq-dns-6546db6db7-lmjwf\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.502407 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbrs5\" (UniqueName: \"kubernetes.io/projected/fe3803ca-5503-427a-8c0c-c93684d02f5d-kube-api-access-vbrs5\") pod \"dnsmasq-dns-6546db6db7-lmjwf\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.502427 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5fqz\" (UniqueName: \"kubernetes.io/projected/e87eb546-663e-4f10-bedb-6101fd30b384-kube-api-access-m5fqz\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.502459 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-lmjwf\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.502476 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-dns-svc\") pod \"dnsmasq-dns-6546db6db7-lmjwf\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.502494 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-config-data\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.502514 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-credential-keys\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.502537 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-lmjwf\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.502564 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-combined-ca-bundle\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.502660 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-fernet-keys\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.604282 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-scripts\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.604331 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-config\") pod \"dnsmasq-dns-6546db6db7-lmjwf\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.604381 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbrs5\" (UniqueName: \"kubernetes.io/projected/fe3803ca-5503-427a-8c0c-c93684d02f5d-kube-api-access-vbrs5\") pod \"dnsmasq-dns-6546db6db7-lmjwf\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.604399 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5fqz\" (UniqueName: \"kubernetes.io/projected/e87eb546-663e-4f10-bedb-6101fd30b384-kube-api-access-m5fqz\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.604428 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-lmjwf\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.604445 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-dns-svc\") pod \"dnsmasq-dns-6546db6db7-lmjwf\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.604461 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-config-data\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.604478 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-credential-keys\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.604495 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-lmjwf\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.604515 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-combined-ca-bundle\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.604572 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-fernet-keys\") pod 
\"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.605420 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-config\") pod \"dnsmasq-dns-6546db6db7-lmjwf\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.606377 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-dns-svc\") pod \"dnsmasq-dns-6546db6db7-lmjwf\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.607548 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-lmjwf\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.610641 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-lmjwf\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.612419 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-fernet-keys\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc 
kubenswrapper[4964]: I1004 02:56:31.614099 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-credential-keys\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.621428 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-config-data\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.622974 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-scripts\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.623387 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-combined-ca-bundle\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.628773 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbrs5\" (UniqueName: \"kubernetes.io/projected/fe3803ca-5503-427a-8c0c-c93684d02f5d-kube-api-access-vbrs5\") pod \"dnsmasq-dns-6546db6db7-lmjwf\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.640876 4964 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-db-sync-nl7sp"] Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.641865 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.645160 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5fqz\" (UniqueName: \"kubernetes.io/projected/e87eb546-663e-4f10-bedb-6101fd30b384-kube-api-access-m5fqz\") pod \"keystone-bootstrap-plfh6\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") " pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.658341 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.661570 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.661703 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-96ccz" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.667399 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nl7sp"] Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.679669 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-lmjwf"] Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.680236 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.686509 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.689061 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.701318 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.701439 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.703680 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.715054 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-plfh6" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.735912 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-p8tj2"] Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.738858 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.752687 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-p8tj2"] Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.809531 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl8dd\" (UniqueName: \"kubernetes.io/projected/8e531d8d-5769-4736-b8cd-0121cc087e2e-kube-api-access-fl8dd\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.809574 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-scripts\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc 
kubenswrapper[4964]: I1004 02:56:31.809591 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-config-data\") pod \"placement-db-sync-nl7sp\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") " pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.809640 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-combined-ca-bundle\") pod \"placement-db-sync-nl7sp\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") " pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.809656 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e531d8d-5769-4736-b8cd-0121cc087e2e-log-httpd\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.809686 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.809708 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.809730 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-scripts\") pod \"placement-db-sync-nl7sp\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") " pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.809749 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-config-data\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.809782 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e531d8d-5769-4736-b8cd-0121cc087e2e-run-httpd\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.809806 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da802d94-8d77-4b2c-88a0-3edc6e7c115b-logs\") pod \"placement-db-sync-nl7sp\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") " pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.809832 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz4gc\" (UniqueName: \"kubernetes.io/projected/da802d94-8d77-4b2c-88a0-3edc6e7c115b-kube-api-access-kz4gc\") pod \"placement-db-sync-nl7sp\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") " pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.828466 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0b5c-account-create-nwmgs" 
event={"ID":"81b5818e-cfbf-49bc-995f-b7024d46b020","Type":"ContainerStarted","Data":"ae4aa1114a4130bec97e3f870bdf46e359790cb91d51b471447033f7b62ec2ca"} Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.832961 4964 generic.go:334] "Generic (PLEG): container finished" podID="868df0b1-4478-408f-9f71-b829873b6ca9" containerID="540409652c96b13755fa9c7de3fd553c9e0042f8c0e4b312edea646b6a031bfe" exitCode=0 Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.833006 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" event={"ID":"868df0b1-4478-408f-9f71-b829873b6ca9","Type":"ContainerDied","Data":"540409652c96b13755fa9c7de3fd553c9e0042f8c0e4b312edea646b6a031bfe"} Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.835467 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4f8a-account-create-flfbn" event={"ID":"26dddf74-bd10-4534-9c12-8fd8c9311475","Type":"ContainerStarted","Data":"e68be7bfb795edc45c2a960b9d954fe9c1136dd9361a3c072827abb06f40bf86"} Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.836709 4964 generic.go:334] "Generic (PLEG): container finished" podID="e2fdf522-a1fc-4ce6-a819-533ac61f5a5d" containerID="f65cf10e157fc8a8d0f0471e5ad0d73fb0c460ac3b9630ce9cc8295cd2f61e5a" exitCode=0 Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.836747 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6c4d-account-create-rv5nz" event={"ID":"e2fdf522-a1fc-4ce6-a819-533ac61f5a5d","Type":"ContainerDied","Data":"f65cf10e157fc8a8d0f0471e5ad0d73fb0c460ac3b9630ce9cc8295cd2f61e5a"} Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.846729 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-0b5c-account-create-nwmgs" podStartSLOduration=3.846712391 podStartE2EDuration="3.846712391s" podCreationTimestamp="2025-10-04 02:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:56:31.844647925 +0000 UTC m=+971.741606563" watchObservedRunningTime="2025-10-04 02:56:31.846712391 +0000 UTC m=+971.743671019" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.911310 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-p8tj2\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.911355 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-combined-ca-bundle\") pod \"placement-db-sync-nl7sp\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") " pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.911373 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e531d8d-5769-4736-b8cd-0121cc087e2e-log-httpd\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.911396 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-p8tj2\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.911424 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.911447 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.911467 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-scripts\") pod \"placement-db-sync-nl7sp\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") " pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.911605 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-config-data\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.911652 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-p8tj2\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.911671 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e531d8d-5769-4736-b8cd-0121cc087e2e-run-httpd\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " 
pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.911695 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da802d94-8d77-4b2c-88a0-3edc6e7c115b-logs\") pod \"placement-db-sync-nl7sp\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") " pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.911813 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz5p6\" (UniqueName: \"kubernetes.io/projected/488b1daa-7288-46cf-b351-2d6cec22c917-kube-api-access-rz5p6\") pod \"dnsmasq-dns-7987f74bbc-p8tj2\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.911857 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-config\") pod \"dnsmasq-dns-7987f74bbc-p8tj2\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.911882 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz4gc\" (UniqueName: \"kubernetes.io/projected/da802d94-8d77-4b2c-88a0-3edc6e7c115b-kube-api-access-kz4gc\") pod \"placement-db-sync-nl7sp\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") " pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.911979 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl8dd\" (UniqueName: \"kubernetes.io/projected/8e531d8d-5769-4736-b8cd-0121cc087e2e-kube-api-access-fl8dd\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc 
kubenswrapper[4964]: I1004 02:56:31.912003 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-scripts\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.912026 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-config-data\") pod \"placement-db-sync-nl7sp\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") " pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.912462 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da802d94-8d77-4b2c-88a0-3edc6e7c115b-logs\") pod \"placement-db-sync-nl7sp\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") " pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.912533 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e531d8d-5769-4736-b8cd-0121cc087e2e-run-httpd\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.913969 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e531d8d-5769-4736-b8cd-0121cc087e2e-log-httpd\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.916580 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-config-data\") pod \"placement-db-sync-nl7sp\" (UID: 
\"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") " pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.920737 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.921573 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-scripts\") pod \"placement-db-sync-nl7sp\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") " pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.922391 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-config-data\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.922768 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-scripts\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.924909 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-combined-ca-bundle\") pod \"placement-db-sync-nl7sp\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") " pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.927494 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.928125 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz4gc\" (UniqueName: \"kubernetes.io/projected/da802d94-8d77-4b2c-88a0-3edc6e7c115b-kube-api-access-kz4gc\") pod \"placement-db-sync-nl7sp\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") " pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.930215 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl8dd\" (UniqueName: \"kubernetes.io/projected/8e531d8d-5769-4736-b8cd-0121cc087e2e-kube-api-access-fl8dd\") pod \"ceilometer-0\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " pod="openstack/ceilometer-0" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.953607 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:31 crc kubenswrapper[4964]: I1004 02:56:31.976904 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.013694 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-p8tj2\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.013758 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz5p6\" (UniqueName: \"kubernetes.io/projected/488b1daa-7288-46cf-b351-2d6cec22c917-kube-api-access-rz5p6\") pod \"dnsmasq-dns-7987f74bbc-p8tj2\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.013777 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-config\") pod \"dnsmasq-dns-7987f74bbc-p8tj2\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.014146 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-p8tj2\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.014198 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-p8tj2\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:32 
crc kubenswrapper[4964]: I1004 02:56:32.015099 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-p8tj2\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.016078 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-p8tj2\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.018544 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-config\") pod \"dnsmasq-dns-7987f74bbc-p8tj2\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.018694 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-p8tj2\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.035601 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz5p6\" (UniqueName: \"kubernetes.io/projected/488b1daa-7288-46cf-b351-2d6cec22c917-kube-api-access-rz5p6\") pod \"dnsmasq-dns-7987f74bbc-p8tj2\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.210860 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6546db6db7-lmjwf"] Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.286770 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.301344 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.317548 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-plfh6"] Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.423296 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-ovsdbserver-nb\") pod \"868df0b1-4478-408f-9f71-b829873b6ca9\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.423594 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzm87\" (UniqueName: \"kubernetes.io/projected/868df0b1-4478-408f-9f71-b829873b6ca9-kube-api-access-bzm87\") pod \"868df0b1-4478-408f-9f71-b829873b6ca9\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.423647 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-ovsdbserver-sb\") pod \"868df0b1-4478-408f-9f71-b829873b6ca9\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.423685 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-dns-svc\") pod \"868df0b1-4478-408f-9f71-b829873b6ca9\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " 
Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.423732 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-config\") pod \"868df0b1-4478-408f-9f71-b829873b6ca9\" (UID: \"868df0b1-4478-408f-9f71-b829873b6ca9\") " Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.427808 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868df0b1-4478-408f-9f71-b829873b6ca9-kube-api-access-bzm87" (OuterVolumeSpecName: "kube-api-access-bzm87") pod "868df0b1-4478-408f-9f71-b829873b6ca9" (UID: "868df0b1-4478-408f-9f71-b829873b6ca9"). InnerVolumeSpecName "kube-api-access-bzm87". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.505974 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "868df0b1-4478-408f-9f71-b829873b6ca9" (UID: "868df0b1-4478-408f-9f71-b829873b6ca9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.506227 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nl7sp"] Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.513191 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "868df0b1-4478-408f-9f71-b829873b6ca9" (UID: "868df0b1-4478-408f-9f71-b829873b6ca9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.514885 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.525065 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.525092 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzm87\" (UniqueName: \"kubernetes.io/projected/868df0b1-4478-408f-9f71-b829873b6ca9-kube-api-access-bzm87\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.525103 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.525106 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "868df0b1-4478-408f-9f71-b829873b6ca9" (UID: "868df0b1-4478-408f-9f71-b829873b6ca9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.530976 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-config" (OuterVolumeSpecName: "config") pod "868df0b1-4478-408f-9f71-b829873b6ca9" (UID: "868df0b1-4478-408f-9f71-b829873b6ca9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.626595 4964 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.626644 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/868df0b1-4478-408f-9f71-b829873b6ca9-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.783728 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-p8tj2"] Oct 04 02:56:32 crc kubenswrapper[4964]: W1004 02:56:32.791038 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod488b1daa_7288_46cf_b351_2d6cec22c917.slice/crio-bfff716ee2e055bdb0b8b0f9e9818d334ac5b0ea34005d0190d19bf146fea0c4 WatchSource:0}: Error finding container bfff716ee2e055bdb0b8b0f9e9818d334ac5b0ea34005d0190d19bf146fea0c4: Status 404 returned error can't find the container with id bfff716ee2e055bdb0b8b0f9e9818d334ac5b0ea34005d0190d19bf146fea0c4 Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.848370 4964 generic.go:334] "Generic (PLEG): container finished" podID="fe3803ca-5503-427a-8c0c-c93684d02f5d" containerID="37477af2f7b2d41c2c0a431f3768c6b2d3f6ee110c95bb7b2fe8cd0232febde2" exitCode=0 Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.850408 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.861420 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" event={"ID":"488b1daa-7288-46cf-b351-2d6cec22c917","Type":"ContainerStarted","Data":"bfff716ee2e055bdb0b8b0f9e9818d334ac5b0ea34005d0190d19bf146fea0c4"} Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.861463 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" event={"ID":"fe3803ca-5503-427a-8c0c-c93684d02f5d","Type":"ContainerDied","Data":"37477af2f7b2d41c2c0a431f3768c6b2d3f6ee110c95bb7b2fe8cd0232febde2"} Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.861482 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" event={"ID":"fe3803ca-5503-427a-8c0c-c93684d02f5d","Type":"ContainerStarted","Data":"9908d96e337d4e75dba9500269e6d6b006b5cde209f6ec17b7c0ac17ecd8b647"} Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.861493 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-sxcq5" event={"ID":"868df0b1-4478-408f-9f71-b829873b6ca9","Type":"ContainerDied","Data":"ae3bbf99e3639c8123e9792930d9829a7c26268674784a2894e98ee701ae4699"} Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.861516 4964 scope.go:117] "RemoveContainer" containerID="540409652c96b13755fa9c7de3fd553c9e0042f8c0e4b312edea646b6a031bfe" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.864501 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nl7sp" event={"ID":"da802d94-8d77-4b2c-88a0-3edc6e7c115b","Type":"ContainerStarted","Data":"9100f6beb53e203a5c2ae1b872264bce26e90c5326471a67bba48a787255f18d"} Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.866820 4964 generic.go:334] "Generic (PLEG): container finished" podID="26dddf74-bd10-4534-9c12-8fd8c9311475" 
containerID="e68be7bfb795edc45c2a960b9d954fe9c1136dd9361a3c072827abb06f40bf86" exitCode=0 Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.866881 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4f8a-account-create-flfbn" event={"ID":"26dddf74-bd10-4534-9c12-8fd8c9311475","Type":"ContainerDied","Data":"e68be7bfb795edc45c2a960b9d954fe9c1136dd9361a3c072827abb06f40bf86"} Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.879669 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-plfh6" event={"ID":"e87eb546-663e-4f10-bedb-6101fd30b384","Type":"ContainerStarted","Data":"081a9a611599be9376db7c437a86112cc79f8eafe0f55048de6ae34d8e5f3f0f"} Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.879716 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-plfh6" event={"ID":"e87eb546-663e-4f10-bedb-6101fd30b384","Type":"ContainerStarted","Data":"d70ea0b8c8d761f89f7c0930ee13e0d684a13f3c5e7772fb18d2b3c25197d657"} Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.895103 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e531d8d-5769-4736-b8cd-0121cc087e2e","Type":"ContainerStarted","Data":"58287914ddd70e30acbf7b2f02ccd87176266e94a31de0d0f60f84e4e99d4e37"} Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.904362 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-plfh6" podStartSLOduration=1.9043434019999999 podStartE2EDuration="1.904343402s" podCreationTimestamp="2025-10-04 02:56:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:56:32.904201668 +0000 UTC m=+972.801160306" watchObservedRunningTime="2025-10-04 02:56:32.904343402 +0000 UTC m=+972.801302040" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.909717 4964 generic.go:334] "Generic (PLEG): 
container finished" podID="81b5818e-cfbf-49bc-995f-b7024d46b020" containerID="ae4aa1114a4130bec97e3f870bdf46e359790cb91d51b471447033f7b62ec2ca" exitCode=0 Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.909811 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0b5c-account-create-nwmgs" event={"ID":"81b5818e-cfbf-49bc-995f-b7024d46b020","Type":"ContainerDied","Data":"ae4aa1114a4130bec97e3f870bdf46e359790cb91d51b471447033f7b62ec2ca"} Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.928317 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-sxcq5"] Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.932291 4964 scope.go:117] "RemoveContainer" containerID="8378bfa5c5e367ce84b59512ca36d46a434bfbbf024d508c4e8f2ced165bdc53" Oct 04 02:56:32 crc kubenswrapper[4964]: I1004 02:56:32.937044 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-sxcq5"] Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.250847 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4f8a-account-create-flfbn" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.299558 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.309584 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6c4d-account-create-rv5nz" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.346561 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-ovsdbserver-nb\") pod \"fe3803ca-5503-427a-8c0c-c93684d02f5d\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.346978 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-config\") pod \"fe3803ca-5503-427a-8c0c-c93684d02f5d\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.347131 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-ovsdbserver-sb\") pod \"fe3803ca-5503-427a-8c0c-c93684d02f5d\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.347213 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94c2g\" (UniqueName: \"kubernetes.io/projected/e2fdf522-a1fc-4ce6-a819-533ac61f5a5d-kube-api-access-94c2g\") pod \"e2fdf522-a1fc-4ce6-a819-533ac61f5a5d\" (UID: \"e2fdf522-a1fc-4ce6-a819-533ac61f5a5d\") " Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.347282 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-dns-svc\") pod \"fe3803ca-5503-427a-8c0c-c93684d02f5d\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.347322 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xb6mr\" (UniqueName: \"kubernetes.io/projected/26dddf74-bd10-4534-9c12-8fd8c9311475-kube-api-access-xb6mr\") pod \"26dddf74-bd10-4534-9c12-8fd8c9311475\" (UID: \"26dddf74-bd10-4534-9c12-8fd8c9311475\") " Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.347374 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbrs5\" (UniqueName: \"kubernetes.io/projected/fe3803ca-5503-427a-8c0c-c93684d02f5d-kube-api-access-vbrs5\") pod \"fe3803ca-5503-427a-8c0c-c93684d02f5d\" (UID: \"fe3803ca-5503-427a-8c0c-c93684d02f5d\") " Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.352455 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2fdf522-a1fc-4ce6-a819-533ac61f5a5d-kube-api-access-94c2g" (OuterVolumeSpecName: "kube-api-access-94c2g") pod "e2fdf522-a1fc-4ce6-a819-533ac61f5a5d" (UID: "e2fdf522-a1fc-4ce6-a819-533ac61f5a5d"). InnerVolumeSpecName "kube-api-access-94c2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.352741 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe3803ca-5503-427a-8c0c-c93684d02f5d-kube-api-access-vbrs5" (OuterVolumeSpecName: "kube-api-access-vbrs5") pod "fe3803ca-5503-427a-8c0c-c93684d02f5d" (UID: "fe3803ca-5503-427a-8c0c-c93684d02f5d"). InnerVolumeSpecName "kube-api-access-vbrs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.356877 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26dddf74-bd10-4534-9c12-8fd8c9311475-kube-api-access-xb6mr" (OuterVolumeSpecName: "kube-api-access-xb6mr") pod "26dddf74-bd10-4534-9c12-8fd8c9311475" (UID: "26dddf74-bd10-4534-9c12-8fd8c9311475"). InnerVolumeSpecName "kube-api-access-xb6mr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.372491 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fe3803ca-5503-427a-8c0c-c93684d02f5d" (UID: "fe3803ca-5503-427a-8c0c-c93684d02f5d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.383669 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fe3803ca-5503-427a-8c0c-c93684d02f5d" (UID: "fe3803ca-5503-427a-8c0c-c93684d02f5d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.385002 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-config" (OuterVolumeSpecName: "config") pod "fe3803ca-5503-427a-8c0c-c93684d02f5d" (UID: "fe3803ca-5503-427a-8c0c-c93684d02f5d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.407819 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe3803ca-5503-427a-8c0c-c93684d02f5d" (UID: "fe3803ca-5503-427a-8c0c-c93684d02f5d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.449982 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb6mr\" (UniqueName: \"kubernetes.io/projected/26dddf74-bd10-4534-9c12-8fd8c9311475-kube-api-access-xb6mr\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.450015 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbrs5\" (UniqueName: \"kubernetes.io/projected/fe3803ca-5503-427a-8c0c-c93684d02f5d-kube-api-access-vbrs5\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.450024 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.450034 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.450042 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.450049 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94c2g\" (UniqueName: \"kubernetes.io/projected/e2fdf522-a1fc-4ce6-a819-533ac61f5a5d-kube-api-access-94c2g\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.450057 4964 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe3803ca-5503-427a-8c0c-c93684d02f5d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 
02:56:33.512415 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.920207 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6c4d-account-create-rv5nz" event={"ID":"e2fdf522-a1fc-4ce6-a819-533ac61f5a5d","Type":"ContainerDied","Data":"2133608917d485ffa36e64778d7c1b0048a81fd20cf628dfa81259c5d58b717a"} Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.920243 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2133608917d485ffa36e64778d7c1b0048a81fd20cf628dfa81259c5d58b717a" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.920269 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6c4d-account-create-rv5nz" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.924193 4964 generic.go:334] "Generic (PLEG): container finished" podID="488b1daa-7288-46cf-b351-2d6cec22c917" containerID="00aa6bd9d7a2c0272b6fe574f963789ae848ddbd2238276dc716ac6519baa521" exitCode=0 Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.924314 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" event={"ID":"488b1daa-7288-46cf-b351-2d6cec22c917","Type":"ContainerDied","Data":"00aa6bd9d7a2c0272b6fe574f963789ae848ddbd2238276dc716ac6519baa521"} Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.929046 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.929634 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-lmjwf" event={"ID":"fe3803ca-5503-427a-8c0c-c93684d02f5d","Type":"ContainerDied","Data":"9908d96e337d4e75dba9500269e6d6b006b5cde209f6ec17b7c0ac17ecd8b647"} Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.929690 4964 scope.go:117] "RemoveContainer" containerID="37477af2f7b2d41c2c0a431f3768c6b2d3f6ee110c95bb7b2fe8cd0232febde2" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.939195 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4f8a-account-create-flfbn" Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.939224 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4f8a-account-create-flfbn" event={"ID":"26dddf74-bd10-4534-9c12-8fd8c9311475","Type":"ContainerDied","Data":"d50549c39cdf2b9a67d3be712e517bd7b8d7864218c34c20f8c65fb9aa85036a"} Oct 04 02:56:33 crc kubenswrapper[4964]: I1004 02:56:33.939256 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d50549c39cdf2b9a67d3be712e517bd7b8d7864218c34c20f8c65fb9aa85036a" Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.157682 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-lmjwf"] Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.178253 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-lmjwf"] Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.326443 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0b5c-account-create-nwmgs" Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.448826 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.448885 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.448927 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.449535 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"008a6133f8963f8c25283a4615f3f65b17e14a1929e0bda2e812a4ec5ec09c24"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.449591 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://008a6133f8963f8c25283a4615f3f65b17e14a1929e0bda2e812a4ec5ec09c24" gracePeriod=600 Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.471045 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-975kc\" (UniqueName: \"kubernetes.io/projected/81b5818e-cfbf-49bc-995f-b7024d46b020-kube-api-access-975kc\") pod \"81b5818e-cfbf-49bc-995f-b7024d46b020\" (UID: \"81b5818e-cfbf-49bc-995f-b7024d46b020\") " Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.475114 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81b5818e-cfbf-49bc-995f-b7024d46b020-kube-api-access-975kc" (OuterVolumeSpecName: "kube-api-access-975kc") pod "81b5818e-cfbf-49bc-995f-b7024d46b020" (UID: "81b5818e-cfbf-49bc-995f-b7024d46b020"). InnerVolumeSpecName "kube-api-access-975kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.574399 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-975kc\" (UniqueName: \"kubernetes.io/projected/81b5818e-cfbf-49bc-995f-b7024d46b020-kube-api-access-975kc\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.860558 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="868df0b1-4478-408f-9f71-b829873b6ca9" path="/var/lib/kubelet/pods/868df0b1-4478-408f-9f71-b829873b6ca9/volumes" Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.862274 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe3803ca-5503-427a-8c0c-c93684d02f5d" path="/var/lib/kubelet/pods/fe3803ca-5503-427a-8c0c-c93684d02f5d/volumes" Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.973602 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="008a6133f8963f8c25283a4615f3f65b17e14a1929e0bda2e812a4ec5ec09c24" exitCode=0 Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.973642 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" 
event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"008a6133f8963f8c25283a4615f3f65b17e14a1929e0bda2e812a4ec5ec09c24"} Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.973686 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"be21f7ce532c3d058512254656f890799806a56eaaee57c83d963d0c90820139"} Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.973704 4964 scope.go:117] "RemoveContainer" containerID="6efe9a8f74bbf3944c47eb916499cc67675937487bfe4fb926abac0853174b18" Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.976474 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0b5c-account-create-nwmgs" event={"ID":"81b5818e-cfbf-49bc-995f-b7024d46b020","Type":"ContainerDied","Data":"a2ba8bb1c9c3e8bfd05fee5ae7f50380fa2d1acabace230a59efdbddbeca6140"} Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.976534 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2ba8bb1c9c3e8bfd05fee5ae7f50380fa2d1acabace230a59efdbddbeca6140" Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.976498 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0b5c-account-create-nwmgs" Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.981497 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" event={"ID":"488b1daa-7288-46cf-b351-2d6cec22c917","Type":"ContainerStarted","Data":"4e914fdd841b439ed4574800421d4923941a5c357d0ebcaad3d08e9e4a9e3059"} Oct 04 02:56:34 crc kubenswrapper[4964]: I1004 02:56:34.981717 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:56:35 crc kubenswrapper[4964]: E1004 02:56:35.580262 4964 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode87eb546_663e_4f10_bedb_6101fd30b384.slice/crio-conmon-081a9a611599be9376db7c437a86112cc79f8eafe0f55048de6ae34d8e5f3f0f.scope\": RecentStats: unable to find data in memory cache]" Oct 04 02:56:35 crc kubenswrapper[4964]: I1004 02:56:35.993157 4964 generic.go:334] "Generic (PLEG): container finished" podID="e87eb546-663e-4f10-bedb-6101fd30b384" containerID="081a9a611599be9376db7c437a86112cc79f8eafe0f55048de6ae34d8e5f3f0f" exitCode=0 Oct 04 02:56:35 crc kubenswrapper[4964]: I1004 02:56:35.993225 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-plfh6" event={"ID":"e87eb546-663e-4f10-bedb-6101fd30b384","Type":"ContainerDied","Data":"081a9a611599be9376db7c437a86112cc79f8eafe0f55048de6ae34d8e5f3f0f"} Oct 04 02:56:36 crc kubenswrapper[4964]: I1004 02:56:36.018138 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" podStartSLOduration=5.018110365 podStartE2EDuration="5.018110365s" podCreationTimestamp="2025-10-04 02:56:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 
02:56:35.012796347 +0000 UTC m=+974.909755005" watchObservedRunningTime="2025-10-04 02:56:36.018110365 +0000 UTC m=+975.915069013" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.097532 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-c8kkp"] Oct 04 02:56:38 crc kubenswrapper[4964]: E1004 02:56:38.098249 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868df0b1-4478-408f-9f71-b829873b6ca9" containerName="init" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.098261 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="868df0b1-4478-408f-9f71-b829873b6ca9" containerName="init" Oct 04 02:56:38 crc kubenswrapper[4964]: E1004 02:56:38.098273 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe3803ca-5503-427a-8c0c-c93684d02f5d" containerName="init" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.098279 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3803ca-5503-427a-8c0c-c93684d02f5d" containerName="init" Oct 04 02:56:38 crc kubenswrapper[4964]: E1004 02:56:38.098287 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868df0b1-4478-408f-9f71-b829873b6ca9" containerName="dnsmasq-dns" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.098294 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="868df0b1-4478-408f-9f71-b829873b6ca9" containerName="dnsmasq-dns" Oct 04 02:56:38 crc kubenswrapper[4964]: E1004 02:56:38.098308 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2fdf522-a1fc-4ce6-a819-533ac61f5a5d" containerName="mariadb-account-create" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.098314 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2fdf522-a1fc-4ce6-a819-533ac61f5a5d" containerName="mariadb-account-create" Oct 04 02:56:38 crc kubenswrapper[4964]: E1004 02:56:38.098325 4964 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="81b5818e-cfbf-49bc-995f-b7024d46b020" containerName="mariadb-account-create" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.098330 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b5818e-cfbf-49bc-995f-b7024d46b020" containerName="mariadb-account-create" Oct 04 02:56:38 crc kubenswrapper[4964]: E1004 02:56:38.098338 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26dddf74-bd10-4534-9c12-8fd8c9311475" containerName="mariadb-account-create" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.098344 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="26dddf74-bd10-4534-9c12-8fd8c9311475" containerName="mariadb-account-create" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.098491 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="26dddf74-bd10-4534-9c12-8fd8c9311475" containerName="mariadb-account-create" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.098498 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="81b5818e-cfbf-49bc-995f-b7024d46b020" containerName="mariadb-account-create" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.098511 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2fdf522-a1fc-4ce6-a819-533ac61f5a5d" containerName="mariadb-account-create" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.098519 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="868df0b1-4478-408f-9f71-b829873b6ca9" containerName="dnsmasq-dns" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.098529 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe3803ca-5503-427a-8c0c-c93684d02f5d" containerName="init" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.099014 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.105191 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6fpbz" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.106491 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.110233 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.115829 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-c8kkp"] Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.239592 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-combined-ca-bundle\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.239728 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-db-sync-config-data\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.239750 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-scripts\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.239770 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvxgp\" (UniqueName: \"kubernetes.io/projected/969137a9-7e00-4472-8582-8008c5647750-kube-api-access-pvxgp\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.239794 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/969137a9-7e00-4472-8582-8008c5647750-etc-machine-id\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.239812 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-config-data\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.341777 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-combined-ca-bundle\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.341894 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-db-sync-config-data\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.341918 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-scripts\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.341943 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvxgp\" (UniqueName: \"kubernetes.io/projected/969137a9-7e00-4472-8582-8008c5647750-kube-api-access-pvxgp\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.341971 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/969137a9-7e00-4472-8582-8008c5647750-etc-machine-id\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.342081 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/969137a9-7e00-4472-8582-8008c5647750-etc-machine-id\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.342382 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-config-data\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.348366 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-bwnzx"] Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.349687 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bwnzx"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.352726 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-config-data\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.354351 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-szd9g"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.354767 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.355955 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-db-sync-config-data\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.361764 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bwnzx"]
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.363628 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-combined-ca-bundle\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.371234 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvxgp\" (UniqueName: \"kubernetes.io/projected/969137a9-7e00-4472-8582-8008c5647750-kube-api-access-pvxgp\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.384989 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-scripts\") pod \"cinder-db-sync-c8kkp\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " pod="openstack/cinder-db-sync-c8kkp"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.415901 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-c8kkp"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.498087 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-8sd9c"]
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.499175 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8sd9c"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.500874 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.505774 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.506040 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8sd9c"]
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.506654 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-g7kwd"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.546531 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnl4g\" (UniqueName: \"kubernetes.io/projected/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-kube-api-access-xnl4g\") pod \"barbican-db-sync-bwnzx\" (UID: \"79e6a68d-94f3-4485-9bbd-eccc7a9398d2\") " pod="openstack/barbican-db-sync-bwnzx"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.546728 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-db-sync-config-data\") pod \"barbican-db-sync-bwnzx\" (UID: \"79e6a68d-94f3-4485-9bbd-eccc7a9398d2\") " pod="openstack/barbican-db-sync-bwnzx"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.546795 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-combined-ca-bundle\") pod \"barbican-db-sync-bwnzx\" (UID: \"79e6a68d-94f3-4485-9bbd-eccc7a9398d2\") " pod="openstack/barbican-db-sync-bwnzx"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.648645 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481d6463-07ac-427e-b5f6-f85143ebf2e0-combined-ca-bundle\") pod \"neutron-db-sync-8sd9c\" (UID: \"481d6463-07ac-427e-b5f6-f85143ebf2e0\") " pod="openstack/neutron-db-sync-8sd9c"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.648703 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-db-sync-config-data\") pod \"barbican-db-sync-bwnzx\" (UID: \"79e6a68d-94f3-4485-9bbd-eccc7a9398d2\") " pod="openstack/barbican-db-sync-bwnzx"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.648758 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-combined-ca-bundle\") pod \"barbican-db-sync-bwnzx\" (UID: \"79e6a68d-94f3-4485-9bbd-eccc7a9398d2\") " pod="openstack/barbican-db-sync-bwnzx"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.648785 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmgbs\" (UniqueName: \"kubernetes.io/projected/481d6463-07ac-427e-b5f6-f85143ebf2e0-kube-api-access-fmgbs\") pod \"neutron-db-sync-8sd9c\" (UID: \"481d6463-07ac-427e-b5f6-f85143ebf2e0\") " pod="openstack/neutron-db-sync-8sd9c"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.648814 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/481d6463-07ac-427e-b5f6-f85143ebf2e0-config\") pod \"neutron-db-sync-8sd9c\" (UID: \"481d6463-07ac-427e-b5f6-f85143ebf2e0\") " pod="openstack/neutron-db-sync-8sd9c"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.648912 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnl4g\" (UniqueName: \"kubernetes.io/projected/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-kube-api-access-xnl4g\") pod \"barbican-db-sync-bwnzx\" (UID: \"79e6a68d-94f3-4485-9bbd-eccc7a9398d2\") " pod="openstack/barbican-db-sync-bwnzx"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.654187 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-db-sync-config-data\") pod \"barbican-db-sync-bwnzx\" (UID: \"79e6a68d-94f3-4485-9bbd-eccc7a9398d2\") " pod="openstack/barbican-db-sync-bwnzx"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.654222 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-combined-ca-bundle\") pod \"barbican-db-sync-bwnzx\" (UID: \"79e6a68d-94f3-4485-9bbd-eccc7a9398d2\") " pod="openstack/barbican-db-sync-bwnzx"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.666896 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnl4g\" (UniqueName: \"kubernetes.io/projected/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-kube-api-access-xnl4g\") pod \"barbican-db-sync-bwnzx\" (UID: \"79e6a68d-94f3-4485-9bbd-eccc7a9398d2\") " pod="openstack/barbican-db-sync-bwnzx"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.740161 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bwnzx"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.750211 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmgbs\" (UniqueName: \"kubernetes.io/projected/481d6463-07ac-427e-b5f6-f85143ebf2e0-kube-api-access-fmgbs\") pod \"neutron-db-sync-8sd9c\" (UID: \"481d6463-07ac-427e-b5f6-f85143ebf2e0\") " pod="openstack/neutron-db-sync-8sd9c"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.750281 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/481d6463-07ac-427e-b5f6-f85143ebf2e0-config\") pod \"neutron-db-sync-8sd9c\" (UID: \"481d6463-07ac-427e-b5f6-f85143ebf2e0\") " pod="openstack/neutron-db-sync-8sd9c"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.750461 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481d6463-07ac-427e-b5f6-f85143ebf2e0-combined-ca-bundle\") pod \"neutron-db-sync-8sd9c\" (UID: \"481d6463-07ac-427e-b5f6-f85143ebf2e0\") " pod="openstack/neutron-db-sync-8sd9c"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.754857 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/481d6463-07ac-427e-b5f6-f85143ebf2e0-config\") pod \"neutron-db-sync-8sd9c\" (UID: \"481d6463-07ac-427e-b5f6-f85143ebf2e0\") " pod="openstack/neutron-db-sync-8sd9c"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.755028 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481d6463-07ac-427e-b5f6-f85143ebf2e0-combined-ca-bundle\") pod \"neutron-db-sync-8sd9c\" (UID: \"481d6463-07ac-427e-b5f6-f85143ebf2e0\") " pod="openstack/neutron-db-sync-8sd9c"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.764809 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmgbs\" (UniqueName: \"kubernetes.io/projected/481d6463-07ac-427e-b5f6-f85143ebf2e0-kube-api-access-fmgbs\") pod \"neutron-db-sync-8sd9c\" (UID: \"481d6463-07ac-427e-b5f6-f85143ebf2e0\") " pod="openstack/neutron-db-sync-8sd9c"
Oct 04 02:56:38 crc kubenswrapper[4964]: I1004 02:56:38.812494 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8sd9c"
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.331510 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-plfh6"
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.462567 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-credential-keys\") pod \"e87eb546-663e-4f10-bedb-6101fd30b384\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") "
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.462657 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-scripts\") pod \"e87eb546-663e-4f10-bedb-6101fd30b384\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") "
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.463833 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5fqz\" (UniqueName: \"kubernetes.io/projected/e87eb546-663e-4f10-bedb-6101fd30b384-kube-api-access-m5fqz\") pod \"e87eb546-663e-4f10-bedb-6101fd30b384\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") "
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.464384 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-fernet-keys\") pod \"e87eb546-663e-4f10-bedb-6101fd30b384\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") "
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.464936 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-combined-ca-bundle\") pod \"e87eb546-663e-4f10-bedb-6101fd30b384\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") "
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.464989 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-config-data\") pod \"e87eb546-663e-4f10-bedb-6101fd30b384\" (UID: \"e87eb546-663e-4f10-bedb-6101fd30b384\") "
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.472530 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e87eb546-663e-4f10-bedb-6101fd30b384" (UID: "e87eb546-663e-4f10-bedb-6101fd30b384"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.483915 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-scripts" (OuterVolumeSpecName: "scripts") pod "e87eb546-663e-4f10-bedb-6101fd30b384" (UID: "e87eb546-663e-4f10-bedb-6101fd30b384"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.485823 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e87eb546-663e-4f10-bedb-6101fd30b384" (UID: "e87eb546-663e-4f10-bedb-6101fd30b384"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.486407 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e87eb546-663e-4f10-bedb-6101fd30b384-kube-api-access-m5fqz" (OuterVolumeSpecName: "kube-api-access-m5fqz") pod "e87eb546-663e-4f10-bedb-6101fd30b384" (UID: "e87eb546-663e-4f10-bedb-6101fd30b384"). InnerVolumeSpecName "kube-api-access-m5fqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.512971 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-config-data" (OuterVolumeSpecName: "config-data") pod "e87eb546-663e-4f10-bedb-6101fd30b384" (UID: "e87eb546-663e-4f10-bedb-6101fd30b384"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.520762 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e87eb546-663e-4f10-bedb-6101fd30b384" (UID: "e87eb546-663e-4f10-bedb-6101fd30b384"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.566497 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.566530 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-config-data\") on node \"crc\" DevicePath \"\""
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.566542 4964 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-credential-keys\") on node \"crc\" DevicePath \"\""
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.566552 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-scripts\") on node \"crc\" DevicePath \"\""
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.566563 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5fqz\" (UniqueName: \"kubernetes.io/projected/e87eb546-663e-4f10-bedb-6101fd30b384-kube-api-access-m5fqz\") on node \"crc\" DevicePath \"\""
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.566572 4964 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e87eb546-663e-4f10-bedb-6101fd30b384-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.925479 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-c8kkp"]
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.937397 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8sd9c"]
Oct 04 02:56:39 crc kubenswrapper[4964]: W1004 02:56:39.939373 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod969137a9_7e00_4472_8582_8008c5647750.slice/crio-e699e02e3c2b164afc9b7797efb1349a3e100416edf93524b4369fab0716a0ee WatchSource:0}: Error finding container e699e02e3c2b164afc9b7797efb1349a3e100416edf93524b4369fab0716a0ee: Status 404 returned error can't find the container with id e699e02e3c2b164afc9b7797efb1349a3e100416edf93524b4369fab0716a0ee
Oct 04 02:56:39 crc kubenswrapper[4964]: W1004 02:56:39.941108 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79e6a68d_94f3_4485_9bbd_eccc7a9398d2.slice/crio-ace31cdbbbf7e1ecb8fb831eed6ca99795a03b94d23551896d82f6334e2fd541 WatchSource:0}: Error finding container ace31cdbbbf7e1ecb8fb831eed6ca99795a03b94d23551896d82f6334e2fd541: Status 404 returned error can't find the container with id ace31cdbbbf7e1ecb8fb831eed6ca99795a03b94d23551896d82f6334e2fd541
Oct 04 02:56:39 crc kubenswrapper[4964]: I1004 02:56:39.942959 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bwnzx"]
Oct 04 02:56:39 crc kubenswrapper[4964]: W1004 02:56:39.943058 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod481d6463_07ac_427e_b5f6_f85143ebf2e0.slice/crio-49e336d15ae5de384ad8ce8205d535cd8632105f46da1b75d2306bcf9590ef17 WatchSource:0}: Error finding container 49e336d15ae5de384ad8ce8205d535cd8632105f46da1b75d2306bcf9590ef17: Status 404 returned error can't find the container with id 49e336d15ae5de384ad8ce8205d535cd8632105f46da1b75d2306bcf9590ef17
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.034154 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-c8kkp" event={"ID":"969137a9-7e00-4472-8582-8008c5647750","Type":"ContainerStarted","Data":"e699e02e3c2b164afc9b7797efb1349a3e100416edf93524b4369fab0716a0ee"}
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.035701 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nl7sp" event={"ID":"da802d94-8d77-4b2c-88a0-3edc6e7c115b","Type":"ContainerStarted","Data":"039b184bc3f8d943465149fe94df5806f19d5ecaa5316c9422a4b54ff08f2756"}
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.037353 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-plfh6" event={"ID":"e87eb546-663e-4f10-bedb-6101fd30b384","Type":"ContainerDied","Data":"d70ea0b8c8d761f89f7c0930ee13e0d684a13f3c5e7772fb18d2b3c25197d657"}
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.037383 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d70ea0b8c8d761f89f7c0930ee13e0d684a13f3c5e7772fb18d2b3c25197d657"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.037360 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-plfh6"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.041703 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e531d8d-5769-4736-b8cd-0121cc087e2e","Type":"ContainerStarted","Data":"be1e3fb11f0a12bf9efa7291c4d91b3a114b14c15d94394925043662efaa4d73"}
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.051139 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8sd9c" event={"ID":"481d6463-07ac-427e-b5f6-f85143ebf2e0","Type":"ContainerStarted","Data":"49e336d15ae5de384ad8ce8205d535cd8632105f46da1b75d2306bcf9590ef17"}
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.056428 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-nl7sp" podStartSLOduration=2.178216503 podStartE2EDuration="9.056412211s" podCreationTimestamp="2025-10-04 02:56:31 +0000 UTC" firstStartedPulling="2025-10-04 02:56:32.51529592 +0000 UTC m=+972.412254558" lastFinishedPulling="2025-10-04 02:56:39.393491618 +0000 UTC m=+979.290450266" observedRunningTime="2025-10-04 02:56:40.050542429 +0000 UTC m=+979.947501067" watchObservedRunningTime="2025-10-04 02:56:40.056412211 +0000 UTC m=+979.953370849"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.059905 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bwnzx" event={"ID":"79e6a68d-94f3-4485-9bbd-eccc7a9398d2","Type":"ContainerStarted","Data":"ace31cdbbbf7e1ecb8fb831eed6ca99795a03b94d23551896d82f6334e2fd541"}
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.430002 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-plfh6"]
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.435067 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-plfh6"]
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.515083 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-z4k2k"]
Oct 04 02:56:40 crc kubenswrapper[4964]: E1004 02:56:40.515484 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e87eb546-663e-4f10-bedb-6101fd30b384" containerName="keystone-bootstrap"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.515502 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="e87eb546-663e-4f10-bedb-6101fd30b384" containerName="keystone-bootstrap"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.515678 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="e87eb546-663e-4f10-bedb-6101fd30b384" containerName="keystone-bootstrap"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.516243 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.522328 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.522552 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mslgh"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.522761 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.523434 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.530933 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-z4k2k"]
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.597407 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-scripts\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.597869 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-credential-keys\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.597896 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-combined-ca-bundle\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.597952 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-config-data\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.597968 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-fernet-keys\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.597988 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwrjz\" (UniqueName: \"kubernetes.io/projected/4dda8845-8294-4367-a5e7-055b6e6711a3-kube-api-access-kwrjz\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.698798 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-scripts\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.698907 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-credential-keys\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.698934 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-combined-ca-bundle\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.699028 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-config-data\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.699053 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-fernet-keys\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.699085 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwrjz\" (UniqueName: \"kubernetes.io/projected/4dda8845-8294-4367-a5e7-055b6e6711a3-kube-api-access-kwrjz\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.705160 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-combined-ca-bundle\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.706087 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-credential-keys\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.706506 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-scripts\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.710963 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-fernet-keys\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.718130 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-config-data\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.723936 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwrjz\" (UniqueName: \"kubernetes.io/projected/4dda8845-8294-4367-a5e7-055b6e6711a3-kube-api-access-kwrjz\") pod \"keystone-bootstrap-z4k2k\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.841576 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-z4k2k"
Oct 04 02:56:40 crc kubenswrapper[4964]: I1004 02:56:40.862485 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e87eb546-663e-4f10-bedb-6101fd30b384" path="/var/lib/kubelet/pods/e87eb546-663e-4f10-bedb-6101fd30b384/volumes"
Oct 04 02:56:41 crc kubenswrapper[4964]: I1004 02:56:41.079700 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8sd9c" event={"ID":"481d6463-07ac-427e-b5f6-f85143ebf2e0","Type":"ContainerStarted","Data":"439ef8ae51fee2e8b76d9e1c18c5d37ab3626298dffab2ced570b3a0a35acd72"}
Oct 04 02:56:41 crc kubenswrapper[4964]: I1004 02:56:41.096910 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-8sd9c" podStartSLOduration=3.096894829 podStartE2EDuration="3.096894829s" podCreationTimestamp="2025-10-04 02:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:56:41.094257005 +0000 UTC m=+980.991215643" watchObservedRunningTime="2025-10-04 02:56:41.096894829 +0000 UTC m=+980.993853467"
Oct 04 02:56:41 crc kubenswrapper[4964]: I1004 02:56:41.448164 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-z4k2k"]
Oct 04 02:56:41 crc kubenswrapper[4964]: W1004 02:56:41.600457 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dda8845_8294_4367_a5e7_055b6e6711a3.slice/crio-a0d8b1b635ba3eea98ae1578cd67ccf93f1b2afccdaf9b5dba82a250fdba3050 WatchSource:0}: Error finding container a0d8b1b635ba3eea98ae1578cd67ccf93f1b2afccdaf9b5dba82a250fdba3050: Status 404 returned error can't find the container with id a0d8b1b635ba3eea98ae1578cd67ccf93f1b2afccdaf9b5dba82a250fdba3050
Oct 04 02:56:42 crc kubenswrapper[4964]: I1004 02:56:42.095961 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z4k2k" event={"ID":"4dda8845-8294-4367-a5e7-055b6e6711a3","Type":"ContainerStarted","Data":"cf10e2cd976ae055f82275fbfd7b235ee1d866f99cfd501646f693bc6318d43e"}
Oct 04 02:56:42 crc kubenswrapper[4964]: I1004 02:56:42.096219 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z4k2k" event={"ID":"4dda8845-8294-4367-a5e7-055b6e6711a3","Type":"ContainerStarted","Data":"a0d8b1b635ba3eea98ae1578cd67ccf93f1b2afccdaf9b5dba82a250fdba3050"}
Oct 04 02:56:42 crc kubenswrapper[4964]: I1004 02:56:42.099322 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e531d8d-5769-4736-b8cd-0121cc087e2e","Type":"ContainerStarted","Data":"0809a491687d9a54e43b2778c1219519c7bc2445f90ae7f8903490bea6f57d0e"}
Oct 04 02:56:42 crc kubenswrapper[4964]: I1004 02:56:42.102990 4964 generic.go:334] "Generic (PLEG): container finished" podID="da802d94-8d77-4b2c-88a0-3edc6e7c115b" containerID="039b184bc3f8d943465149fe94df5806f19d5ecaa5316c9422a4b54ff08f2756" exitCode=0
Oct 04 02:56:42 crc kubenswrapper[4964]: I1004 02:56:42.103692 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nl7sp" event={"ID":"da802d94-8d77-4b2c-88a0-3edc6e7c115b","Type":"ContainerDied","Data":"039b184bc3f8d943465149fe94df5806f19d5ecaa5316c9422a4b54ff08f2756"}
Oct 04 02:56:42 crc kubenswrapper[4964]: I1004 02:56:42.117264 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-z4k2k" podStartSLOduration=2.11724915 podStartE2EDuration="2.11724915s" podCreationTimestamp="2025-10-04 02:56:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:56:42.114696729 +0000 UTC m=+982.011655377" watchObservedRunningTime="2025-10-04 02:56:42.11724915 +0000 UTC m=+982.014207788"
Oct 04 02:56:42 crc kubenswrapper[4964]: I1004 02:56:42.288314 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2"
Oct 04 02:56:42 crc kubenswrapper[4964]: I1004 02:56:42.357885 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zgcfk"]
Oct 04 02:56:42 crc kubenswrapper[4964]: I1004 02:56:42.358102 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" podUID="1e2229e9-e432-454f-8133-9d94f1a785a9" containerName="dnsmasq-dns" containerID="cri-o://d98a05a86de0c64a89db8bac4ed4c51a3b592c63391ed898921e437950d7f0e3" gracePeriod=10
Oct 04 02:56:43 crc kubenswrapper[4964]: I1004 02:56:43.112745 4964 generic.go:334] "Generic (PLEG): container finished" podID="1e2229e9-e432-454f-8133-9d94f1a785a9" containerID="d98a05a86de0c64a89db8bac4ed4c51a3b592c63391ed898921e437950d7f0e3" exitCode=0
Oct 04 02:56:43 crc kubenswrapper[4964]: I1004 02:56:43.112803 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" event={"ID":"1e2229e9-e432-454f-8133-9d94f1a785a9","Type":"ContainerDied","Data":"d98a05a86de0c64a89db8bac4ed4c51a3b592c63391ed898921e437950d7f0e3"}
Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.634570 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nl7sp"
Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.802740 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da802d94-8d77-4b2c-88a0-3edc6e7c115b-logs\") pod \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") "
Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.802852 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz4gc\" (UniqueName: \"kubernetes.io/projected/da802d94-8d77-4b2c-88a0-3edc6e7c115b-kube-api-access-kz4gc\") pod \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") "
Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.802903 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-combined-ca-bundle\") pod \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") "
Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.802930 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-config-data\") pod \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") "
Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.802953 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-scripts\") pod \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\" (UID: \"da802d94-8d77-4b2c-88a0-3edc6e7c115b\") "
Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.803973 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da802d94-8d77-4b2c-88a0-3edc6e7c115b-logs" (OuterVolumeSpecName: "logs") pod "da802d94-8d77-4b2c-88a0-3edc6e7c115b" (UID: "da802d94-8d77-4b2c-88a0-3edc6e7c115b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.809383 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-scripts" (OuterVolumeSpecName: "scripts") pod "da802d94-8d77-4b2c-88a0-3edc6e7c115b" (UID: "da802d94-8d77-4b2c-88a0-3edc6e7c115b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.812393 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da802d94-8d77-4b2c-88a0-3edc6e7c115b-kube-api-access-kz4gc" (OuterVolumeSpecName: "kube-api-access-kz4gc") pod "da802d94-8d77-4b2c-88a0-3edc6e7c115b" (UID: "da802d94-8d77-4b2c-88a0-3edc6e7c115b"). InnerVolumeSpecName "kube-api-access-kz4gc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.842769 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-config-data" (OuterVolumeSpecName: "config-data") pod "da802d94-8d77-4b2c-88a0-3edc6e7c115b" (UID: "da802d94-8d77-4b2c-88a0-3edc6e7c115b"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.845355 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da802d94-8d77-4b2c-88a0-3edc6e7c115b" (UID: "da802d94-8d77-4b2c-88a0-3edc6e7c115b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.904198 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.904223 4964 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da802d94-8d77-4b2c-88a0-3edc6e7c115b-logs\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.904232 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz4gc\" (UniqueName: \"kubernetes.io/projected/da802d94-8d77-4b2c-88a0-3edc6e7c115b-kube-api-access-kz4gc\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.904242 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:44 crc kubenswrapper[4964]: I1004 02:56:44.904251 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da802d94-8d77-4b2c-88a0-3edc6e7c115b-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.161308 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-nl7sp" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.161982 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nl7sp" event={"ID":"da802d94-8d77-4b2c-88a0-3edc6e7c115b","Type":"ContainerDied","Data":"9100f6beb53e203a5c2ae1b872264bce26e90c5326471a67bba48a787255f18d"} Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.162008 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9100f6beb53e203a5c2ae1b872264bce26e90c5326471a67bba48a787255f18d" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.164852 4964 generic.go:334] "Generic (PLEG): container finished" podID="4dda8845-8294-4367-a5e7-055b6e6711a3" containerID="cf10e2cd976ae055f82275fbfd7b235ee1d866f99cfd501646f693bc6318d43e" exitCode=0 Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.164888 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z4k2k" event={"ID":"4dda8845-8294-4367-a5e7-055b6e6711a3","Type":"ContainerDied","Data":"cf10e2cd976ae055f82275fbfd7b235ee1d866f99cfd501646f693bc6318d43e"} Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.476257 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" podUID="1e2229e9-e432-454f-8133-9d94f1a785a9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.732851 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-855ccd47c4-qrtzn"] Oct 04 02:56:45 crc kubenswrapper[4964]: E1004 02:56:45.733708 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da802d94-8d77-4b2c-88a0-3edc6e7c115b" containerName="placement-db-sync" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.733722 4964 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="da802d94-8d77-4b2c-88a0-3edc6e7c115b" containerName="placement-db-sync" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.733890 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="da802d94-8d77-4b2c-88a0-3edc6e7c115b" containerName="placement-db-sync" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.734864 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.736510 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.736677 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-96ccz" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.736982 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.738029 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.740104 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.744432 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-855ccd47c4-qrtzn"] Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.849864 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.862599 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18dfa90-2818-4164-a806-41cb55bb188c-config-data\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.862676 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b18dfa90-2818-4164-a806-41cb55bb188c-logs\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.862701 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18dfa90-2818-4164-a806-41cb55bb188c-internal-tls-certs\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.862736 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b18dfa90-2818-4164-a806-41cb55bb188c-scripts\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.862760 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wvjh\" (UniqueName: \"kubernetes.io/projected/b18dfa90-2818-4164-a806-41cb55bb188c-kube-api-access-2wvjh\") pod \"placement-855ccd47c4-qrtzn\" (UID: 
\"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.862776 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18dfa90-2818-4164-a806-41cb55bb188c-combined-ca-bundle\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.862804 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18dfa90-2818-4164-a806-41cb55bb188c-public-tls-certs\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.964274 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-config\") pod \"1e2229e9-e432-454f-8133-9d94f1a785a9\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.964337 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-ovsdbserver-nb\") pod \"1e2229e9-e432-454f-8133-9d94f1a785a9\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.964462 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-ovsdbserver-sb\") pod \"1e2229e9-e432-454f-8133-9d94f1a785a9\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 
02:56:45.964539 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-dns-svc\") pod \"1e2229e9-e432-454f-8133-9d94f1a785a9\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.964637 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnqjk\" (UniqueName: \"kubernetes.io/projected/1e2229e9-e432-454f-8133-9d94f1a785a9-kube-api-access-dnqjk\") pod \"1e2229e9-e432-454f-8133-9d94f1a785a9\" (UID: \"1e2229e9-e432-454f-8133-9d94f1a785a9\") " Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.964886 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18dfa90-2818-4164-a806-41cb55bb188c-config-data\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.964916 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b18dfa90-2818-4164-a806-41cb55bb188c-logs\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.964942 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18dfa90-2818-4164-a806-41cb55bb188c-internal-tls-certs\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.964974 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b18dfa90-2818-4164-a806-41cb55bb188c-scripts\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.965002 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wvjh\" (UniqueName: \"kubernetes.io/projected/b18dfa90-2818-4164-a806-41cb55bb188c-kube-api-access-2wvjh\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.965020 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18dfa90-2818-4164-a806-41cb55bb188c-combined-ca-bundle\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.965061 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18dfa90-2818-4164-a806-41cb55bb188c-public-tls-certs\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.966246 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b18dfa90-2818-4164-a806-41cb55bb188c-logs\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.969231 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e2229e9-e432-454f-8133-9d94f1a785a9-kube-api-access-dnqjk" (OuterVolumeSpecName: 
"kube-api-access-dnqjk") pod "1e2229e9-e432-454f-8133-9d94f1a785a9" (UID: "1e2229e9-e432-454f-8133-9d94f1a785a9"). InnerVolumeSpecName "kube-api-access-dnqjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.985082 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18dfa90-2818-4164-a806-41cb55bb188c-internal-tls-certs\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.986604 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18dfa90-2818-4164-a806-41cb55bb188c-combined-ca-bundle\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.986787 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b18dfa90-2818-4164-a806-41cb55bb188c-scripts\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.986792 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18dfa90-2818-4164-a806-41cb55bb188c-config-data\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.987179 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wvjh\" (UniqueName: \"kubernetes.io/projected/b18dfa90-2818-4164-a806-41cb55bb188c-kube-api-access-2wvjh\") pod 
\"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:45 crc kubenswrapper[4964]: I1004 02:56:45.987919 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18dfa90-2818-4164-a806-41cb55bb188c-public-tls-certs\") pod \"placement-855ccd47c4-qrtzn\" (UID: \"b18dfa90-2818-4164-a806-41cb55bb188c\") " pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.022666 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1e2229e9-e432-454f-8133-9d94f1a785a9" (UID: "1e2229e9-e432-454f-8133-9d94f1a785a9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.036069 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-config" (OuterVolumeSpecName: "config") pod "1e2229e9-e432-454f-8133-9d94f1a785a9" (UID: "1e2229e9-e432-454f-8133-9d94f1a785a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.036729 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1e2229e9-e432-454f-8133-9d94f1a785a9" (UID: "1e2229e9-e432-454f-8133-9d94f1a785a9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.052871 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1e2229e9-e432-454f-8133-9d94f1a785a9" (UID: "1e2229e9-e432-454f-8133-9d94f1a785a9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.066224 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnqjk\" (UniqueName: \"kubernetes.io/projected/1e2229e9-e432-454f-8133-9d94f1a785a9-kube-api-access-dnqjk\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.066253 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.066264 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.066276 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.066285 4964 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e2229e9-e432-454f-8133-9d94f1a785a9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.149724 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.183272 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" event={"ID":"1e2229e9-e432-454f-8133-9d94f1a785a9","Type":"ContainerDied","Data":"e70d872d6c86a3cb39a33157572d260c7a351f7ed2e7798394d74efa31ef7e67"} Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.183345 4964 scope.go:117] "RemoveContainer" containerID="d98a05a86de0c64a89db8bac4ed4c51a3b592c63391ed898921e437950d7f0e3" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.183288 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zgcfk" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.187141 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bwnzx" event={"ID":"79e6a68d-94f3-4485-9bbd-eccc7a9398d2","Type":"ContainerStarted","Data":"5615a921688ae3cdb7bc4a565d9b1f2299a8d75acd232e359ce34b10078cabe3"} Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.204144 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-bwnzx" podStartSLOduration=2.606850874 podStartE2EDuration="8.204127399s" podCreationTimestamp="2025-10-04 02:56:38 +0000 UTC" firstStartedPulling="2025-10-04 02:56:39.946037893 +0000 UTC m=+979.842996521" lastFinishedPulling="2025-10-04 02:56:45.543314408 +0000 UTC m=+985.440273046" observedRunningTime="2025-10-04 02:56:46.202078789 +0000 UTC m=+986.099037427" watchObservedRunningTime="2025-10-04 02:56:46.204127399 +0000 UTC m=+986.101086037" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.226302 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zgcfk"] Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.227878 4964 scope.go:117] "RemoveContainer" 
containerID="a055067fe903878e90cc3a9242df9a82579ab70bb8ecc46d63a6c8be4b9d3d4c" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.234844 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zgcfk"] Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.510875 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-z4k2k" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.685287 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-scripts\") pod \"4dda8845-8294-4367-a5e7-055b6e6711a3\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.685857 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-combined-ca-bundle\") pod \"4dda8845-8294-4367-a5e7-055b6e6711a3\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.685902 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-config-data\") pod \"4dda8845-8294-4367-a5e7-055b6e6711a3\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.685963 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-credential-keys\") pod \"4dda8845-8294-4367-a5e7-055b6e6711a3\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.685984 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-fernet-keys\") pod \"4dda8845-8294-4367-a5e7-055b6e6711a3\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.686160 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwrjz\" (UniqueName: \"kubernetes.io/projected/4dda8845-8294-4367-a5e7-055b6e6711a3-kube-api-access-kwrjz\") pod \"4dda8845-8294-4367-a5e7-055b6e6711a3\" (UID: \"4dda8845-8294-4367-a5e7-055b6e6711a3\") " Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.690662 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-scripts" (OuterVolumeSpecName: "scripts") pod "4dda8845-8294-4367-a5e7-055b6e6711a3" (UID: "4dda8845-8294-4367-a5e7-055b6e6711a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.690917 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4dda8845-8294-4367-a5e7-055b6e6711a3" (UID: "4dda8845-8294-4367-a5e7-055b6e6711a3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.691711 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dda8845-8294-4367-a5e7-055b6e6711a3-kube-api-access-kwrjz" (OuterVolumeSpecName: "kube-api-access-kwrjz") pod "4dda8845-8294-4367-a5e7-055b6e6711a3" (UID: "4dda8845-8294-4367-a5e7-055b6e6711a3"). InnerVolumeSpecName "kube-api-access-kwrjz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.709693 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4dda8845-8294-4367-a5e7-055b6e6711a3" (UID: "4dda8845-8294-4367-a5e7-055b6e6711a3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.710803 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-config-data" (OuterVolumeSpecName: "config-data") pod "4dda8845-8294-4367-a5e7-055b6e6711a3" (UID: "4dda8845-8294-4367-a5e7-055b6e6711a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.712815 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dda8845-8294-4367-a5e7-055b6e6711a3" (UID: "4dda8845-8294-4367-a5e7-055b6e6711a3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.745634 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-855ccd47c4-qrtzn"] Oct 04 02:56:46 crc kubenswrapper[4964]: W1004 02:56:46.749576 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb18dfa90_2818_4164_a806_41cb55bb188c.slice/crio-ac185a23ad654bcc409af35fb5f19d53e6274d3b7bc33ea6cbefd9c8247e005a WatchSource:0}: Error finding container ac185a23ad654bcc409af35fb5f19d53e6274d3b7bc33ea6cbefd9c8247e005a: Status 404 returned error can't find the container with id ac185a23ad654bcc409af35fb5f19d53e6274d3b7bc33ea6cbefd9c8247e005a Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.787821 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwrjz\" (UniqueName: \"kubernetes.io/projected/4dda8845-8294-4367-a5e7-055b6e6711a3-kube-api-access-kwrjz\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.787845 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.787855 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.787865 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.787875 4964 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.787882 4964 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dda8845-8294-4367-a5e7-055b6e6711a3-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 04 02:56:46 crc kubenswrapper[4964]: I1004 02:56:46.853433 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e2229e9-e432-454f-8133-9d94f1a785a9" path="/var/lib/kubelet/pods/1e2229e9-e432-454f-8133-9d94f1a785a9/volumes" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.196587 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z4k2k" event={"ID":"4dda8845-8294-4367-a5e7-055b6e6711a3","Type":"ContainerDied","Data":"a0d8b1b635ba3eea98ae1578cd67ccf93f1b2afccdaf9b5dba82a250fdba3050"} Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.196906 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0d8b1b635ba3eea98ae1578cd67ccf93f1b2afccdaf9b5dba82a250fdba3050" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.196667 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-z4k2k" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.200754 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-855ccd47c4-qrtzn" event={"ID":"b18dfa90-2818-4164-a806-41cb55bb188c","Type":"ContainerStarted","Data":"b2c29e2471656adfd6b3b71b0c70659784804101eb0b3ee6ea07c119d4fa9f01"} Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.200790 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-855ccd47c4-qrtzn" event={"ID":"b18dfa90-2818-4164-a806-41cb55bb188c","Type":"ContainerStarted","Data":"3a3bb8e844bdd89dc6675e97a17b2f8171ad97a5dfd4b549f585ea2a60f9a712"} Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.200805 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.200815 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.200823 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-855ccd47c4-qrtzn" event={"ID":"b18dfa90-2818-4164-a806-41cb55bb188c","Type":"ContainerStarted","Data":"ac185a23ad654bcc409af35fb5f19d53e6274d3b7bc33ea6cbefd9c8247e005a"} Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.225785 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-855ccd47c4-qrtzn" podStartSLOduration=2.225768122 podStartE2EDuration="2.225768122s" podCreationTimestamp="2025-10-04 02:56:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:56:47.218389824 +0000 UTC m=+987.115348462" watchObservedRunningTime="2025-10-04 02:56:47.225768122 +0000 UTC m=+987.122726760" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.365180 4964 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-58576f76db-7mmj9"] Oct 04 02:56:47 crc kubenswrapper[4964]: E1004 02:56:47.367887 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e2229e9-e432-454f-8133-9d94f1a785a9" containerName="init" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.367976 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e2229e9-e432-454f-8133-9d94f1a785a9" containerName="init" Oct 04 02:56:47 crc kubenswrapper[4964]: E1004 02:56:47.368071 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e2229e9-e432-454f-8133-9d94f1a785a9" containerName="dnsmasq-dns" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.368124 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e2229e9-e432-454f-8133-9d94f1a785a9" containerName="dnsmasq-dns" Oct 04 02:56:47 crc kubenswrapper[4964]: E1004 02:56:47.368211 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dda8845-8294-4367-a5e7-055b6e6711a3" containerName="keystone-bootstrap" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.368261 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dda8845-8294-4367-a5e7-055b6e6711a3" containerName="keystone-bootstrap" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.377462 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dda8845-8294-4367-a5e7-055b6e6711a3" containerName="keystone-bootstrap" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.377698 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e2229e9-e432-454f-8133-9d94f1a785a9" containerName="dnsmasq-dns" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.378286 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-58576f76db-7mmj9"] Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.378450 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.382144 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.382290 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.382481 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.382588 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mslgh" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.382777 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.383087 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.498262 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-credential-keys\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.498331 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-combined-ca-bundle\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.498367 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-scripts\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.498411 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-public-tls-certs\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.498462 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-internal-tls-certs\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.498486 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kt6d\" (UniqueName: \"kubernetes.io/projected/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-kube-api-access-8kt6d\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.498566 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-config-data\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.498602 4964 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-fernet-keys\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.600922 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-config-data\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.601002 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-fernet-keys\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.601077 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-credential-keys\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.601154 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-combined-ca-bundle\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.601186 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-scripts\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.601260 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-public-tls-certs\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.601337 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-internal-tls-certs\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.601359 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kt6d\" (UniqueName: \"kubernetes.io/projected/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-kube-api-access-8kt6d\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.607105 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-scripts\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.607347 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-public-tls-certs\") pod 
\"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.607397 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-credential-keys\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.613422 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-internal-tls-certs\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.613398 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-combined-ca-bundle\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.614274 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-config-data\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.616756 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-fernet-keys\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 
02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.621393 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kt6d\" (UniqueName: \"kubernetes.io/projected/0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f-kube-api-access-8kt6d\") pod \"keystone-58576f76db-7mmj9\" (UID: \"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f\") " pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:47 crc kubenswrapper[4964]: I1004 02:56:47.697925 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:56:52 crc kubenswrapper[4964]: I1004 02:56:52.259290 4964 generic.go:334] "Generic (PLEG): container finished" podID="79e6a68d-94f3-4485-9bbd-eccc7a9398d2" containerID="5615a921688ae3cdb7bc4a565d9b1f2299a8d75acd232e359ce34b10078cabe3" exitCode=0 Oct 04 02:56:52 crc kubenswrapper[4964]: I1004 02:56:52.259482 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bwnzx" event={"ID":"79e6a68d-94f3-4485-9bbd-eccc7a9398d2","Type":"ContainerDied","Data":"5615a921688ae3cdb7bc4a565d9b1f2299a8d75acd232e359ce34b10078cabe3"} Oct 04 02:57:00 crc kubenswrapper[4964]: I1004 02:57:00.747207 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bwnzx" Oct 04 02:57:00 crc kubenswrapper[4964]: I1004 02:57:00.871330 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-db-sync-config-data\") pod \"79e6a68d-94f3-4485-9bbd-eccc7a9398d2\" (UID: \"79e6a68d-94f3-4485-9bbd-eccc7a9398d2\") " Oct 04 02:57:00 crc kubenswrapper[4964]: I1004 02:57:00.871588 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnl4g\" (UniqueName: \"kubernetes.io/projected/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-kube-api-access-xnl4g\") pod \"79e6a68d-94f3-4485-9bbd-eccc7a9398d2\" (UID: \"79e6a68d-94f3-4485-9bbd-eccc7a9398d2\") " Oct 04 02:57:00 crc kubenswrapper[4964]: I1004 02:57:00.871689 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-combined-ca-bundle\") pod \"79e6a68d-94f3-4485-9bbd-eccc7a9398d2\" (UID: \"79e6a68d-94f3-4485-9bbd-eccc7a9398d2\") " Oct 04 02:57:00 crc kubenswrapper[4964]: I1004 02:57:00.878148 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-kube-api-access-xnl4g" (OuterVolumeSpecName: "kube-api-access-xnl4g") pod "79e6a68d-94f3-4485-9bbd-eccc7a9398d2" (UID: "79e6a68d-94f3-4485-9bbd-eccc7a9398d2"). InnerVolumeSpecName "kube-api-access-xnl4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:00 crc kubenswrapper[4964]: I1004 02:57:00.878386 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "79e6a68d-94f3-4485-9bbd-eccc7a9398d2" (UID: "79e6a68d-94f3-4485-9bbd-eccc7a9398d2"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:00 crc kubenswrapper[4964]: I1004 02:57:00.905909 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79e6a68d-94f3-4485-9bbd-eccc7a9398d2" (UID: "79e6a68d-94f3-4485-9bbd-eccc7a9398d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:00 crc kubenswrapper[4964]: I1004 02:57:00.973843 4964 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:00 crc kubenswrapper[4964]: I1004 02:57:00.973874 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnl4g\" (UniqueName: \"kubernetes.io/projected/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-kube-api-access-xnl4g\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:00 crc kubenswrapper[4964]: I1004 02:57:00.973887 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e6a68d-94f3-4485-9bbd-eccc7a9398d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:01 crc kubenswrapper[4964]: I1004 02:57:01.338445 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bwnzx" event={"ID":"79e6a68d-94f3-4485-9bbd-eccc7a9398d2","Type":"ContainerDied","Data":"ace31cdbbbf7e1ecb8fb831eed6ca99795a03b94d23551896d82f6334e2fd541"} Oct 04 02:57:01 crc kubenswrapper[4964]: I1004 02:57:01.338467 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bwnzx" Oct 04 02:57:01 crc kubenswrapper[4964]: I1004 02:57:01.338480 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ace31cdbbbf7e1ecb8fb831eed6ca99795a03b94d23551896d82f6334e2fd541" Oct 04 02:57:01 crc kubenswrapper[4964]: E1004 02:57:01.968280 4964 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 04 02:57:01 crc kubenswrapper[4964]: E1004 02:57:01.968856 4964 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/co
nfig.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pvxgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-c8kkp_openstack(969137a9-7e00-4472-8582-8008c5647750): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 02:57:01 crc kubenswrapper[4964]: E1004 02:57:01.984073 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-c8kkp" podUID="969137a9-7e00-4472-8582-8008c5647750" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.044322 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-968687b55-z5p6q"] Oct 04 02:57:02 crc kubenswrapper[4964]: E1004 02:57:02.047140 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e6a68d-94f3-4485-9bbd-eccc7a9398d2" containerName="barbican-db-sync" Oct 04 02:57:02 crc 
kubenswrapper[4964]: I1004 02:57:02.047163 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e6a68d-94f3-4485-9bbd-eccc7a9398d2" containerName="barbican-db-sync" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.047370 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e6a68d-94f3-4485-9bbd-eccc7a9398d2" containerName="barbican-db-sync" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.049451 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.056282 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-szd9g" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.056646 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.056783 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.071001 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-968687b55-z5p6q"] Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.092039 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6897cddb66-8b6jw"] Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.093499 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.099013 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.109690 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6897cddb66-8b6jw"] Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.114971 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-699df9757c-nv7b4"] Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.116523 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.128663 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-nv7b4"] Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.214810 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fdccf76-1497-4e20-bae6-0eecb8f80d2f-config-data\") pod \"barbican-keystone-listener-6897cddb66-8b6jw\" (UID: \"2fdccf76-1497-4e20-bae6-0eecb8f80d2f\") " pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.214856 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fdccf76-1497-4e20-bae6-0eecb8f80d2f-logs\") pod \"barbican-keystone-listener-6897cddb66-8b6jw\" (UID: \"2fdccf76-1497-4e20-bae6-0eecb8f80d2f\") " pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.214903 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0-config-data-custom\") pod \"barbican-worker-968687b55-z5p6q\" (UID: \"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0\") " pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.214929 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fdccf76-1497-4e20-bae6-0eecb8f80d2f-combined-ca-bundle\") pod \"barbican-keystone-listener-6897cddb66-8b6jw\" (UID: \"2fdccf76-1497-4e20-bae6-0eecb8f80d2f\") " pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.214982 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fdccf76-1497-4e20-bae6-0eecb8f80d2f-config-data-custom\") pod \"barbican-keystone-listener-6897cddb66-8b6jw\" (UID: \"2fdccf76-1497-4e20-bae6-0eecb8f80d2f\") " pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.215006 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqnhj\" (UniqueName: \"kubernetes.io/projected/8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0-kube-api-access-hqnhj\") pod \"barbican-worker-968687b55-z5p6q\" (UID: \"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0\") " pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.215029 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0-combined-ca-bundle\") pod \"barbican-worker-968687b55-z5p6q\" (UID: \"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0\") " pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc 
kubenswrapper[4964]: I1004 02:57:02.215044 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0-logs\") pod \"barbican-worker-968687b55-z5p6q\" (UID: \"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0\") " pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.215058 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lrlk\" (UniqueName: \"kubernetes.io/projected/2fdccf76-1497-4e20-bae6-0eecb8f80d2f-kube-api-access-6lrlk\") pod \"barbican-keystone-listener-6897cddb66-8b6jw\" (UID: \"2fdccf76-1497-4e20-bae6-0eecb8f80d2f\") " pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.215100 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0-config-data\") pod \"barbican-worker-968687b55-z5p6q\" (UID: \"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0\") " pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.282337 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6fc588d58-bq7fk"] Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.287725 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.296814 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.311951 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fc588d58-bq7fk"] Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.320997 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-dns-svc\") pod \"dnsmasq-dns-699df9757c-nv7b4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.321036 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0-config-data\") pod \"barbican-worker-968687b55-z5p6q\" (UID: \"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0\") " pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.321073 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-nv7b4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.321099 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fdccf76-1497-4e20-bae6-0eecb8f80d2f-config-data\") pod \"barbican-keystone-listener-6897cddb66-8b6jw\" (UID: \"2fdccf76-1497-4e20-bae6-0eecb8f80d2f\") " 
pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.321117 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fdccf76-1497-4e20-bae6-0eecb8f80d2f-logs\") pod \"barbican-keystone-listener-6897cddb66-8b6jw\" (UID: \"2fdccf76-1497-4e20-bae6-0eecb8f80d2f\") " pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.321137 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-config\") pod \"dnsmasq-dns-699df9757c-nv7b4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.321159 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpzg5\" (UniqueName: \"kubernetes.io/projected/11b61102-db4d-41f3-975b-bed490026bf4-kube-api-access-fpzg5\") pod \"dnsmasq-dns-699df9757c-nv7b4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.321183 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0-config-data-custom\") pod \"barbican-worker-968687b55-z5p6q\" (UID: \"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0\") " pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.321208 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fdccf76-1497-4e20-bae6-0eecb8f80d2f-combined-ca-bundle\") pod \"barbican-keystone-listener-6897cddb66-8b6jw\" (UID: 
\"2fdccf76-1497-4e20-bae6-0eecb8f80d2f\") " pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.321237 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-nv7b4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.321454 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fdccf76-1497-4e20-bae6-0eecb8f80d2f-config-data-custom\") pod \"barbican-keystone-listener-6897cddb66-8b6jw\" (UID: \"2fdccf76-1497-4e20-bae6-0eecb8f80d2f\") " pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.321477 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqnhj\" (UniqueName: \"kubernetes.io/projected/8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0-kube-api-access-hqnhj\") pod \"barbican-worker-968687b55-z5p6q\" (UID: \"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0\") " pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.321497 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0-combined-ca-bundle\") pod \"barbican-worker-968687b55-z5p6q\" (UID: \"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0\") " pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.321513 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0-logs\") pod 
\"barbican-worker-968687b55-z5p6q\" (UID: \"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0\") " pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.321529 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lrlk\" (UniqueName: \"kubernetes.io/projected/2fdccf76-1497-4e20-bae6-0eecb8f80d2f-kube-api-access-6lrlk\") pod \"barbican-keystone-listener-6897cddb66-8b6jw\" (UID: \"2fdccf76-1497-4e20-bae6-0eecb8f80d2f\") " pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.323711 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fdccf76-1497-4e20-bae6-0eecb8f80d2f-logs\") pod \"barbican-keystone-listener-6897cddb66-8b6jw\" (UID: \"2fdccf76-1497-4e20-bae6-0eecb8f80d2f\") " pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.330021 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0-logs\") pod \"barbican-worker-968687b55-z5p6q\" (UID: \"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0\") " pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.331245 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fdccf76-1497-4e20-bae6-0eecb8f80d2f-combined-ca-bundle\") pod \"barbican-keystone-listener-6897cddb66-8b6jw\" (UID: \"2fdccf76-1497-4e20-bae6-0eecb8f80d2f\") " pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.333161 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0-config-data-custom\") pod 
\"barbican-worker-968687b55-z5p6q\" (UID: \"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0\") " pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.336591 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0-config-data\") pod \"barbican-worker-968687b55-z5p6q\" (UID: \"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0\") " pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.346322 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0-combined-ca-bundle\") pod \"barbican-worker-968687b55-z5p6q\" (UID: \"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0\") " pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.346803 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fdccf76-1497-4e20-bae6-0eecb8f80d2f-config-data-custom\") pod \"barbican-keystone-listener-6897cddb66-8b6jw\" (UID: \"2fdccf76-1497-4e20-bae6-0eecb8f80d2f\") " pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.347504 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fdccf76-1497-4e20-bae6-0eecb8f80d2f-config-data\") pod \"barbican-keystone-listener-6897cddb66-8b6jw\" (UID: \"2fdccf76-1497-4e20-bae6-0eecb8f80d2f\") " pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.370400 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lrlk\" (UniqueName: \"kubernetes.io/projected/2fdccf76-1497-4e20-bae6-0eecb8f80d2f-kube-api-access-6lrlk\") pod 
\"barbican-keystone-listener-6897cddb66-8b6jw\" (UID: \"2fdccf76-1497-4e20-bae6-0eecb8f80d2f\") " pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.386905 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqnhj\" (UniqueName: \"kubernetes.io/projected/8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0-kube-api-access-hqnhj\") pod \"barbican-worker-968687b55-z5p6q\" (UID: \"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0\") " pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.389805 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e531d8d-5769-4736-b8cd-0121cc087e2e","Type":"ContainerStarted","Data":"8ef0878061622cda1c154452219b93ed6c75505fadf2d443df39f43a0333f867"} Oct 04 02:57:02 crc kubenswrapper[4964]: E1004 02:57:02.391323 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-c8kkp" podUID="969137a9-7e00-4472-8582-8008c5647750" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.412494 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-968687b55-z5p6q" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.417715 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-58576f76db-7mmj9"] Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.422994 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.423511 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-nv7b4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.423569 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-config\") pod \"dnsmasq-dns-699df9757c-nv7b4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.423593 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-config-data\") pod \"barbican-api-6fc588d58-bq7fk\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.423608 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpzg5\" (UniqueName: \"kubernetes.io/projected/11b61102-db4d-41f3-975b-bed490026bf4-kube-api-access-fpzg5\") pod \"dnsmasq-dns-699df9757c-nv7b4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.423653 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-logs\") pod \"barbican-api-6fc588d58-bq7fk\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " 
pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.423698 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-nv7b4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.423716 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-226vr\" (UniqueName: \"kubernetes.io/projected/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-kube-api-access-226vr\") pod \"barbican-api-6fc588d58-bq7fk\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.423744 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-config-data-custom\") pod \"barbican-api-6fc588d58-bq7fk\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.423764 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-combined-ca-bundle\") pod \"barbican-api-6fc588d58-bq7fk\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.424631 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-nv7b4\" (UID: 
\"11b61102-db4d-41f3-975b-bed490026bf4\") " pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.424635 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-config\") pod \"dnsmasq-dns-699df9757c-nv7b4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.424662 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-nv7b4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.424710 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-dns-svc\") pod \"dnsmasq-dns-699df9757c-nv7b4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.425478 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-dns-svc\") pod \"dnsmasq-dns-699df9757c-nv7b4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.441507 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpzg5\" (UniqueName: \"kubernetes.io/projected/11b61102-db4d-41f3-975b-bed490026bf4-kube-api-access-fpzg5\") pod \"dnsmasq-dns-699df9757c-nv7b4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc 
kubenswrapper[4964]: I1004 02:57:02.527155 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-226vr\" (UniqueName: \"kubernetes.io/projected/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-kube-api-access-226vr\") pod \"barbican-api-6fc588d58-bq7fk\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.527217 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-config-data-custom\") pod \"barbican-api-6fc588d58-bq7fk\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.527244 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-combined-ca-bundle\") pod \"barbican-api-6fc588d58-bq7fk\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.527375 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-config-data\") pod \"barbican-api-6fc588d58-bq7fk\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.527400 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-logs\") pod \"barbican-api-6fc588d58-bq7fk\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.530259 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-logs\") pod \"barbican-api-6fc588d58-bq7fk\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.533220 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-config-data-custom\") pod \"barbican-api-6fc588d58-bq7fk\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.536017 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-config-data\") pod \"barbican-api-6fc588d58-bq7fk\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.536536 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-combined-ca-bundle\") pod \"barbican-api-6fc588d58-bq7fk\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.552419 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-226vr\" (UniqueName: \"kubernetes.io/projected/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-kube-api-access-226vr\") pod \"barbican-api-6fc588d58-bq7fk\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.621263 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.738164 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.910784 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-968687b55-z5p6q"] Oct 04 02:57:02 crc kubenswrapper[4964]: W1004 02:57:02.923747 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8eb1b2cd_e6d5_4aed_99d6_45f992a68cf0.slice/crio-aa1ca3bfb76f6da15cdb1766ac923b39cfbbebe4f673fdea77b8985451080a8c WatchSource:0}: Error finding container aa1ca3bfb76f6da15cdb1766ac923b39cfbbebe4f673fdea77b8985451080a8c: Status 404 returned error can't find the container with id aa1ca3bfb76f6da15cdb1766ac923b39cfbbebe4f673fdea77b8985451080a8c Oct 04 02:57:02 crc kubenswrapper[4964]: I1004 02:57:02.977835 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6897cddb66-8b6jw"] Oct 04 02:57:03 crc kubenswrapper[4964]: I1004 02:57:03.092489 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-nv7b4"] Oct 04 02:57:03 crc kubenswrapper[4964]: W1004 02:57:03.105630 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b61102_db4d_41f3_975b_bed490026bf4.slice/crio-af889d9956c875209170cde0d3d9909419fa89d707027853a73a736f75e10a05 WatchSource:0}: Error finding container af889d9956c875209170cde0d3d9909419fa89d707027853a73a736f75e10a05: Status 404 returned error can't find the container with id af889d9956c875209170cde0d3d9909419fa89d707027853a73a736f75e10a05 Oct 04 02:57:03 crc kubenswrapper[4964]: I1004 02:57:03.192871 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-api-6fc588d58-bq7fk"] Oct 04 02:57:03 crc kubenswrapper[4964]: W1004 02:57:03.199844 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cb5d6a7_b37d_4edb_8210_97e60614fc0e.slice/crio-018f7a39d862c27408401e014c2e5aea0065f507ae8406834b4b55d1409c25e8 WatchSource:0}: Error finding container 018f7a39d862c27408401e014c2e5aea0065f507ae8406834b4b55d1409c25e8: Status 404 returned error can't find the container with id 018f7a39d862c27408401e014c2e5aea0065f507ae8406834b4b55d1409c25e8 Oct 04 02:57:03 crc kubenswrapper[4964]: I1004 02:57:03.398464 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-968687b55-z5p6q" event={"ID":"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0","Type":"ContainerStarted","Data":"aa1ca3bfb76f6da15cdb1766ac923b39cfbbebe4f673fdea77b8985451080a8c"} Oct 04 02:57:03 crc kubenswrapper[4964]: I1004 02:57:03.400430 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fc588d58-bq7fk" event={"ID":"1cb5d6a7-b37d-4edb-8210-97e60614fc0e","Type":"ContainerStarted","Data":"55e60088f904e46be4b03ea661defb4b742408eb226fc01bc979281f208fb107"} Oct 04 02:57:03 crc kubenswrapper[4964]: I1004 02:57:03.400453 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fc588d58-bq7fk" event={"ID":"1cb5d6a7-b37d-4edb-8210-97e60614fc0e","Type":"ContainerStarted","Data":"018f7a39d862c27408401e014c2e5aea0065f507ae8406834b4b55d1409c25e8"} Oct 04 02:57:03 crc kubenswrapper[4964]: I1004 02:57:03.401694 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" event={"ID":"2fdccf76-1497-4e20-bae6-0eecb8f80d2f","Type":"ContainerStarted","Data":"98898077fbf40564714f96159bacf7aa4b6020c668f63d298bfd7de90f532f82"} Oct 04 02:57:03 crc kubenswrapper[4964]: I1004 02:57:03.403343 4964 generic.go:334] "Generic (PLEG): container finished" 
podID="11b61102-db4d-41f3-975b-bed490026bf4" containerID="0381ea748c551878777ed429a2f538d1a9be67bdff7c3d71bf68dc6f0d19e89e" exitCode=0 Oct 04 02:57:03 crc kubenswrapper[4964]: I1004 02:57:03.403393 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-nv7b4" event={"ID":"11b61102-db4d-41f3-975b-bed490026bf4","Type":"ContainerDied","Data":"0381ea748c551878777ed429a2f538d1a9be67bdff7c3d71bf68dc6f0d19e89e"} Oct 04 02:57:03 crc kubenswrapper[4964]: I1004 02:57:03.403409 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-nv7b4" event={"ID":"11b61102-db4d-41f3-975b-bed490026bf4","Type":"ContainerStarted","Data":"af889d9956c875209170cde0d3d9909419fa89d707027853a73a736f75e10a05"} Oct 04 02:57:03 crc kubenswrapper[4964]: I1004 02:57:03.406206 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-58576f76db-7mmj9" event={"ID":"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f","Type":"ContainerStarted","Data":"974512ce104a2b661b4e9973578709d199c42fcab776c54550afcf0a87ae2173"} Oct 04 02:57:03 crc kubenswrapper[4964]: I1004 02:57:03.406384 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-58576f76db-7mmj9" event={"ID":"0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f","Type":"ContainerStarted","Data":"ba923d3cc6db1fd7f3bba5f51486b6a5f64f02304e4982ff8423a2257a570a6b"} Oct 04 02:57:03 crc kubenswrapper[4964]: I1004 02:57:03.406876 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:57:03 crc kubenswrapper[4964]: I1004 02:57:03.460559 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-58576f76db-7mmj9" podStartSLOduration=16.460539055 podStartE2EDuration="16.460539055s" podCreationTimestamp="2025-10-04 02:56:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 
02:57:03.456801475 +0000 UTC m=+1003.353760113" watchObservedRunningTime="2025-10-04 02:57:03.460539055 +0000 UTC m=+1003.357497703" Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.430072 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fc588d58-bq7fk" event={"ID":"1cb5d6a7-b37d-4edb-8210-97e60614fc0e","Type":"ContainerStarted","Data":"3f6ad1590cdafd238253a80d892b42c7c494258e10f46bb4176ecb5113422e78"} Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.430332 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.430348 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.433240 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-nv7b4" event={"ID":"11b61102-db4d-41f3-975b-bed490026bf4","Type":"ContainerStarted","Data":"7d0327bb2e917674acd987a54f739cec1b728299cad9b739e6ce3a217938284f"} Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.433398 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.445992 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6fc588d58-bq7fk" podStartSLOduration=2.445980073 podStartE2EDuration="2.445980073s" podCreationTimestamp="2025-10-04 02:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:57:04.444283312 +0000 UTC m=+1004.341241980" watchObservedRunningTime="2025-10-04 02:57:04.445980073 +0000 UTC m=+1004.342938711" Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.479378 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-699df9757c-nv7b4" podStartSLOduration=2.47936206 podStartE2EDuration="2.47936206s" podCreationTimestamp="2025-10-04 02:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:57:04.474707037 +0000 UTC m=+1004.371665705" watchObservedRunningTime="2025-10-04 02:57:04.47936206 +0000 UTC m=+1004.376320698" Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.823185 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56875d85c6-lvzcs"] Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.824765 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.826808 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.832535 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.842657 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56875d85c6-lvzcs"] Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.979114 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76608f85-e25d-4e88-b6da-93c51f75eba8-combined-ca-bundle\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.979210 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76608f85-e25d-4e88-b6da-93c51f75eba8-logs\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: 
\"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.979269 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76608f85-e25d-4e88-b6da-93c51f75eba8-config-data-custom\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.979286 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76608f85-e25d-4e88-b6da-93c51f75eba8-internal-tls-certs\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.979347 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sqhw\" (UniqueName: \"kubernetes.io/projected/76608f85-e25d-4e88-b6da-93c51f75eba8-kube-api-access-5sqhw\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.979376 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76608f85-e25d-4e88-b6da-93c51f75eba8-config-data\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:04 crc kubenswrapper[4964]: I1004 02:57:04.979403 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/76608f85-e25d-4e88-b6da-93c51f75eba8-public-tls-certs\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.080858 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76608f85-e25d-4e88-b6da-93c51f75eba8-logs\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.081149 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76608f85-e25d-4e88-b6da-93c51f75eba8-config-data-custom\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.081250 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76608f85-e25d-4e88-b6da-93c51f75eba8-internal-tls-certs\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.081367 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sqhw\" (UniqueName: \"kubernetes.io/projected/76608f85-e25d-4e88-b6da-93c51f75eba8-kube-api-access-5sqhw\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.081441 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/76608f85-e25d-4e88-b6da-93c51f75eba8-logs\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.081515 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76608f85-e25d-4e88-b6da-93c51f75eba8-config-data\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.081634 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76608f85-e25d-4e88-b6da-93c51f75eba8-public-tls-certs\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.081754 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76608f85-e25d-4e88-b6da-93c51f75eba8-combined-ca-bundle\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.086276 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76608f85-e25d-4e88-b6da-93c51f75eba8-internal-tls-certs\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.086644 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/76608f85-e25d-4e88-b6da-93c51f75eba8-combined-ca-bundle\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.087946 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76608f85-e25d-4e88-b6da-93c51f75eba8-config-data-custom\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.089004 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76608f85-e25d-4e88-b6da-93c51f75eba8-public-tls-certs\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.089380 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76608f85-e25d-4e88-b6da-93c51f75eba8-config-data\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.102828 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sqhw\" (UniqueName: \"kubernetes.io/projected/76608f85-e25d-4e88-b6da-93c51f75eba8-kube-api-access-5sqhw\") pod \"barbican-api-56875d85c6-lvzcs\" (UID: \"76608f85-e25d-4e88-b6da-93c51f75eba8\") " pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.152205 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.415705 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56875d85c6-lvzcs"] Oct 04 02:57:05 crc kubenswrapper[4964]: W1004 02:57:05.420374 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76608f85_e25d_4e88_b6da_93c51f75eba8.slice/crio-a84602e48921826ffc7fa345b92f5ef433f1bd682f4e340759a3421e3d80db77 WatchSource:0}: Error finding container a84602e48921826ffc7fa345b92f5ef433f1bd682f4e340759a3421e3d80db77: Status 404 returned error can't find the container with id a84602e48921826ffc7fa345b92f5ef433f1bd682f4e340759a3421e3d80db77 Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.443730 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" event={"ID":"2fdccf76-1497-4e20-bae6-0eecb8f80d2f","Type":"ContainerStarted","Data":"89a543cf545680d4f63b5c4fc5abc09f1ed26d6c5602f8a8550e4063b1ee4c01"} Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.444834 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" event={"ID":"2fdccf76-1497-4e20-bae6-0eecb8f80d2f","Type":"ContainerStarted","Data":"4bc7413801bcb83f5016f40321dcd2226f97138a787e6e625404d2dd76ceb314"} Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.447783 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56875d85c6-lvzcs" event={"ID":"76608f85-e25d-4e88-b6da-93c51f75eba8","Type":"ContainerStarted","Data":"a84602e48921826ffc7fa345b92f5ef433f1bd682f4e340759a3421e3d80db77"} Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.455992 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-968687b55-z5p6q" 
event={"ID":"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0","Type":"ContainerStarted","Data":"7cf94e1a4c15940c25cc135defa692f02fa1e4f9d44d8cc33571be29ee6726c9"} Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.456015 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-968687b55-z5p6q" event={"ID":"8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0","Type":"ContainerStarted","Data":"4e1c1b7003d3f6b164f70383e2091d1f6cd4598b971159804a36565ff8443310"} Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.463732 4964 generic.go:334] "Generic (PLEG): container finished" podID="481d6463-07ac-427e-b5f6-f85143ebf2e0" containerID="439ef8ae51fee2e8b76d9e1c18c5d37ab3626298dffab2ced570b3a0a35acd72" exitCode=0 Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.463957 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8sd9c" event={"ID":"481d6463-07ac-427e-b5f6-f85143ebf2e0","Type":"ContainerDied","Data":"439ef8ae51fee2e8b76d9e1c18c5d37ab3626298dffab2ced570b3a0a35acd72"} Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.469490 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6897cddb66-8b6jw" podStartSLOduration=1.956827299 podStartE2EDuration="3.46946981s" podCreationTimestamp="2025-10-04 02:57:02 +0000 UTC" firstStartedPulling="2025-10-04 02:57:02.988168568 +0000 UTC m=+1002.885127196" lastFinishedPulling="2025-10-04 02:57:04.500811069 +0000 UTC m=+1004.397769707" observedRunningTime="2025-10-04 02:57:05.462701907 +0000 UTC m=+1005.359660565" watchObservedRunningTime="2025-10-04 02:57:05.46946981 +0000 UTC m=+1005.366428448" Oct 04 02:57:05 crc kubenswrapper[4964]: I1004 02:57:05.485951 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-968687b55-z5p6q" podStartSLOduration=2.915341497 podStartE2EDuration="4.485935288s" podCreationTimestamp="2025-10-04 02:57:01 +0000 UTC" 
firstStartedPulling="2025-10-04 02:57:02.928719941 +0000 UTC m=+1002.825678579" lastFinishedPulling="2025-10-04 02:57:04.499313732 +0000 UTC m=+1004.396272370" observedRunningTime="2025-10-04 02:57:05.477171317 +0000 UTC m=+1005.374129965" watchObservedRunningTime="2025-10-04 02:57:05.485935288 +0000 UTC m=+1005.382893926" Oct 04 02:57:06 crc kubenswrapper[4964]: I1004 02:57:06.487271 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56875d85c6-lvzcs" event={"ID":"76608f85-e25d-4e88-b6da-93c51f75eba8","Type":"ContainerStarted","Data":"b6c3d76a27f4ebc5525c4d97b2acbf7a51e1014cd5064ace8bf862445df1bafa"} Oct 04 02:57:06 crc kubenswrapper[4964]: I1004 02:57:06.487798 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56875d85c6-lvzcs" event={"ID":"76608f85-e25d-4e88-b6da-93c51f75eba8","Type":"ContainerStarted","Data":"0ebfce59ab9226c42bead714bf2e129d63d5b0cc8039fd6fc56e2dc386e90e9e"} Oct 04 02:57:06 crc kubenswrapper[4964]: I1004 02:57:06.489083 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:06 crc kubenswrapper[4964]: I1004 02:57:06.489146 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:06 crc kubenswrapper[4964]: I1004 02:57:06.526206 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56875d85c6-lvzcs" podStartSLOduration=2.526176181 podStartE2EDuration="2.526176181s" podCreationTimestamp="2025-10-04 02:57:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:57:06.518123307 +0000 UTC m=+1006.415081945" watchObservedRunningTime="2025-10-04 02:57:06.526176181 +0000 UTC m=+1006.423134819" Oct 04 02:57:10 crc kubenswrapper[4964]: I1004 02:57:10.487535 4964 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/neutron-db-sync-8sd9c" Oct 04 02:57:10 crc kubenswrapper[4964]: I1004 02:57:10.501814 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481d6463-07ac-427e-b5f6-f85143ebf2e0-combined-ca-bundle\") pod \"481d6463-07ac-427e-b5f6-f85143ebf2e0\" (UID: \"481d6463-07ac-427e-b5f6-f85143ebf2e0\") " Oct 04 02:57:10 crc kubenswrapper[4964]: I1004 02:57:10.501929 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmgbs\" (UniqueName: \"kubernetes.io/projected/481d6463-07ac-427e-b5f6-f85143ebf2e0-kube-api-access-fmgbs\") pod \"481d6463-07ac-427e-b5f6-f85143ebf2e0\" (UID: \"481d6463-07ac-427e-b5f6-f85143ebf2e0\") " Oct 04 02:57:10 crc kubenswrapper[4964]: I1004 02:57:10.540966 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481d6463-07ac-427e-b5f6-f85143ebf2e0-kube-api-access-fmgbs" (OuterVolumeSpecName: "kube-api-access-fmgbs") pod "481d6463-07ac-427e-b5f6-f85143ebf2e0" (UID: "481d6463-07ac-427e-b5f6-f85143ebf2e0"). InnerVolumeSpecName "kube-api-access-fmgbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:10 crc kubenswrapper[4964]: I1004 02:57:10.553229 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8sd9c" event={"ID":"481d6463-07ac-427e-b5f6-f85143ebf2e0","Type":"ContainerDied","Data":"49e336d15ae5de384ad8ce8205d535cd8632105f46da1b75d2306bcf9590ef17"} Oct 04 02:57:10 crc kubenswrapper[4964]: I1004 02:57:10.553272 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49e336d15ae5de384ad8ce8205d535cd8632105f46da1b75d2306bcf9590ef17" Oct 04 02:57:10 crc kubenswrapper[4964]: I1004 02:57:10.553287 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8sd9c" Oct 04 02:57:10 crc kubenswrapper[4964]: I1004 02:57:10.566868 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481d6463-07ac-427e-b5f6-f85143ebf2e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "481d6463-07ac-427e-b5f6-f85143ebf2e0" (UID: "481d6463-07ac-427e-b5f6-f85143ebf2e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:10 crc kubenswrapper[4964]: I1004 02:57:10.603386 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/481d6463-07ac-427e-b5f6-f85143ebf2e0-config\") pod \"481d6463-07ac-427e-b5f6-f85143ebf2e0\" (UID: \"481d6463-07ac-427e-b5f6-f85143ebf2e0\") " Oct 04 02:57:10 crc kubenswrapper[4964]: I1004 02:57:10.603951 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481d6463-07ac-427e-b5f6-f85143ebf2e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:10 crc kubenswrapper[4964]: I1004 02:57:10.603969 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmgbs\" (UniqueName: \"kubernetes.io/projected/481d6463-07ac-427e-b5f6-f85143ebf2e0-kube-api-access-fmgbs\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:10 crc kubenswrapper[4964]: I1004 02:57:10.634576 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481d6463-07ac-427e-b5f6-f85143ebf2e0-config" (OuterVolumeSpecName: "config") pod "481d6463-07ac-427e-b5f6-f85143ebf2e0" (UID: "481d6463-07ac-427e-b5f6-f85143ebf2e0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:10 crc kubenswrapper[4964]: I1004 02:57:10.705368 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/481d6463-07ac-427e-b5f6-f85143ebf2e0-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.495754 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.517208 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56875d85c6-lvzcs" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.568415 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="ceilometer-central-agent" containerID="cri-o://be1e3fb11f0a12bf9efa7291c4d91b3a114b14c15d94394925043662efaa4d73" gracePeriod=30 Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.568525 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e531d8d-5769-4736-b8cd-0121cc087e2e","Type":"ContainerStarted","Data":"7ac3feddb2d6f0a426bfe337f732419a83e40165d77803230a04f3b71fee965d"} Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.568571 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.569918 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="proxy-httpd" containerID="cri-o://7ac3feddb2d6f0a426bfe337f732419a83e40165d77803230a04f3b71fee965d" gracePeriod=30 Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.570024 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="sg-core" containerID="cri-o://8ef0878061622cda1c154452219b93ed6c75505fadf2d443df39f43a0333f867" gracePeriod=30 Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.570104 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="ceilometer-notification-agent" containerID="cri-o://0809a491687d9a54e43b2778c1219519c7bc2445f90ae7f8903490bea6f57d0e" gracePeriod=30 Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.612421 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.736660134 podStartE2EDuration="40.612401924s" podCreationTimestamp="2025-10-04 02:56:31 +0000 UTC" firstStartedPulling="2025-10-04 02:56:32.528379221 +0000 UTC m=+972.425337859" lastFinishedPulling="2025-10-04 02:57:10.404121011 +0000 UTC m=+1010.301079649" observedRunningTime="2025-10-04 02:57:11.608772487 +0000 UTC m=+1011.505731145" watchObservedRunningTime="2025-10-04 02:57:11.612401924 +0000 UTC m=+1011.509360562" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.613500 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6fc588d58-bq7fk"] Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.613793 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6fc588d58-bq7fk" podUID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" containerName="barbican-api-log" containerID="cri-o://55e60088f904e46be4b03ea661defb4b742408eb226fc01bc979281f208fb107" gracePeriod=30 Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.613965 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6fc588d58-bq7fk" podUID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" containerName="barbican-api" 
containerID="cri-o://3f6ad1590cdafd238253a80d892b42c7c494258e10f46bb4176ecb5113422e78" gracePeriod=30 Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.662261 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fc588d58-bq7fk" podUID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.145:9311/healthcheck\": EOF" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.788352 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-nv7b4"] Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.788594 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-699df9757c-nv7b4" podUID="11b61102-db4d-41f3-975b-bed490026bf4" containerName="dnsmasq-dns" containerID="cri-o://7d0327bb2e917674acd987a54f739cec1b728299cad9b739e6ce3a217938284f" gracePeriod=10 Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.796910 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.864753 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-r2wfw"] Oct 04 02:57:11 crc kubenswrapper[4964]: E1004 02:57:11.865155 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481d6463-07ac-427e-b5f6-f85143ebf2e0" containerName="neutron-db-sync" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.865174 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="481d6463-07ac-427e-b5f6-f85143ebf2e0" containerName="neutron-db-sync" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.865344 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="481d6463-07ac-427e-b5f6-f85143ebf2e0" containerName="neutron-db-sync" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.866202 4964 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.909684 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-r2wfw"] Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.929233 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-dns-svc\") pod \"dnsmasq-dns-6bb684768f-r2wfw\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.929285 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-r2wfw\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.929393 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-config\") pod \"dnsmasq-dns-6bb684768f-r2wfw\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.929431 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x29dn\" (UniqueName: \"kubernetes.io/projected/ee10b681-dd61-4ff3-a697-31cf616048cb-kube-api-access-x29dn\") pod \"dnsmasq-dns-6bb684768f-r2wfw\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.929454 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-r2wfw\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:11 crc kubenswrapper[4964]: I1004 02:57:11.994292 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5cb49d4c48-qflxg"] Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.004142 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.009117 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.009925 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-g7kwd" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.010013 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.011448 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.028710 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5cb49d4c48-qflxg"] Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.030443 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x29dn\" (UniqueName: \"kubernetes.io/projected/ee10b681-dd61-4ff3-a697-31cf616048cb-kube-api-access-x29dn\") pod \"dnsmasq-dns-6bb684768f-r2wfw\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.030488 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-r2wfw\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.030662 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-dns-svc\") pod \"dnsmasq-dns-6bb684768f-r2wfw\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.030686 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-r2wfw\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.030739 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-config\") pod \"dnsmasq-dns-6bb684768f-r2wfw\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.031543 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-config\") pod \"dnsmasq-dns-6bb684768f-r2wfw\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.032281 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-r2wfw\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.032759 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-dns-svc\") pod \"dnsmasq-dns-6bb684768f-r2wfw\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.033227 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-r2wfw\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.064785 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x29dn\" (UniqueName: \"kubernetes.io/projected/ee10b681-dd61-4ff3-a697-31cf616048cb-kube-api-access-x29dn\") pod \"dnsmasq-dns-6bb684768f-r2wfw\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.131502 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-config\") pod \"neutron-5cb49d4c48-qflxg\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.131538 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-httpd-config\") pod \"neutron-5cb49d4c48-qflxg\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.131560 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpr2p\" (UniqueName: \"kubernetes.io/projected/36580143-625b-4a5f-9ac1-890853622671-kube-api-access-tpr2p\") pod \"neutron-5cb49d4c48-qflxg\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.131627 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-ovndb-tls-certs\") pod \"neutron-5cb49d4c48-qflxg\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.131842 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-combined-ca-bundle\") pod \"neutron-5cb49d4c48-qflxg\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.224268 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.235704 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-ovndb-tls-certs\") pod \"neutron-5cb49d4c48-qflxg\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.235764 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-combined-ca-bundle\") pod \"neutron-5cb49d4c48-qflxg\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.235840 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-config\") pod \"neutron-5cb49d4c48-qflxg\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.235858 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-httpd-config\") pod \"neutron-5cb49d4c48-qflxg\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.235876 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpr2p\" (UniqueName: \"kubernetes.io/projected/36580143-625b-4a5f-9ac1-890853622671-kube-api-access-tpr2p\") pod \"neutron-5cb49d4c48-qflxg\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc 
kubenswrapper[4964]: I1004 02:57:12.241364 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-httpd-config\") pod \"neutron-5cb49d4c48-qflxg\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.242456 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-config\") pod \"neutron-5cb49d4c48-qflxg\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.249250 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-combined-ca-bundle\") pod \"neutron-5cb49d4c48-qflxg\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.249683 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-ovndb-tls-certs\") pod \"neutron-5cb49d4c48-qflxg\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.260356 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpr2p\" (UniqueName: \"kubernetes.io/projected/36580143-625b-4a5f-9ac1-890853622671-kube-api-access-tpr2p\") pod \"neutron-5cb49d4c48-qflxg\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.428856 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.551150 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.579056 4964 generic.go:334] "Generic (PLEG): container finished" podID="11b61102-db4d-41f3-975b-bed490026bf4" containerID="7d0327bb2e917674acd987a54f739cec1b728299cad9b739e6ce3a217938284f" exitCode=0 Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.579108 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-nv7b4" event={"ID":"11b61102-db4d-41f3-975b-bed490026bf4","Type":"ContainerDied","Data":"7d0327bb2e917674acd987a54f739cec1b728299cad9b739e6ce3a217938284f"} Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.579133 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-nv7b4" event={"ID":"11b61102-db4d-41f3-975b-bed490026bf4","Type":"ContainerDied","Data":"af889d9956c875209170cde0d3d9909419fa89d707027853a73a736f75e10a05"} Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.579149 4964 scope.go:117] "RemoveContainer" containerID="7d0327bb2e917674acd987a54f739cec1b728299cad9b739e6ce3a217938284f" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.579259 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-nv7b4" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.589873 4964 generic.go:334] "Generic (PLEG): container finished" podID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerID="7ac3feddb2d6f0a426bfe337f732419a83e40165d77803230a04f3b71fee965d" exitCode=0 Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.589898 4964 generic.go:334] "Generic (PLEG): container finished" podID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerID="8ef0878061622cda1c154452219b93ed6c75505fadf2d443df39f43a0333f867" exitCode=2 Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.589906 4964 generic.go:334] "Generic (PLEG): container finished" podID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerID="be1e3fb11f0a12bf9efa7291c4d91b3a114b14c15d94394925043662efaa4d73" exitCode=0 Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.589943 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e531d8d-5769-4736-b8cd-0121cc087e2e","Type":"ContainerDied","Data":"7ac3feddb2d6f0a426bfe337f732419a83e40165d77803230a04f3b71fee965d"} Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.589965 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e531d8d-5769-4736-b8cd-0121cc087e2e","Type":"ContainerDied","Data":"8ef0878061622cda1c154452219b93ed6c75505fadf2d443df39f43a0333f867"} Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.589974 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e531d8d-5769-4736-b8cd-0121cc087e2e","Type":"ContainerDied","Data":"be1e3fb11f0a12bf9efa7291c4d91b3a114b14c15d94394925043662efaa4d73"} Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.591476 4964 generic.go:334] "Generic (PLEG): container finished" podID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" containerID="55e60088f904e46be4b03ea661defb4b742408eb226fc01bc979281f208fb107" exitCode=143 Oct 04 02:57:12 
crc kubenswrapper[4964]: I1004 02:57:12.591498 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fc588d58-bq7fk" event={"ID":"1cb5d6a7-b37d-4edb-8210-97e60614fc0e","Type":"ContainerDied","Data":"55e60088f904e46be4b03ea661defb4b742408eb226fc01bc979281f208fb107"} Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.644795 4964 scope.go:117] "RemoveContainer" containerID="0381ea748c551878777ed429a2f538d1a9be67bdff7c3d71bf68dc6f0d19e89e" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.649162 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-dns-svc\") pod \"11b61102-db4d-41f3-975b-bed490026bf4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.649215 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-ovsdbserver-nb\") pod \"11b61102-db4d-41f3-975b-bed490026bf4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.649243 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpzg5\" (UniqueName: \"kubernetes.io/projected/11b61102-db4d-41f3-975b-bed490026bf4-kube-api-access-fpzg5\") pod \"11b61102-db4d-41f3-975b-bed490026bf4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.649336 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-ovsdbserver-sb\") pod \"11b61102-db4d-41f3-975b-bed490026bf4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.649386 4964 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-config\") pod \"11b61102-db4d-41f3-975b-bed490026bf4\" (UID: \"11b61102-db4d-41f3-975b-bed490026bf4\") " Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.659463 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11b61102-db4d-41f3-975b-bed490026bf4-kube-api-access-fpzg5" (OuterVolumeSpecName: "kube-api-access-fpzg5") pod "11b61102-db4d-41f3-975b-bed490026bf4" (UID: "11b61102-db4d-41f3-975b-bed490026bf4"). InnerVolumeSpecName "kube-api-access-fpzg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.710952 4964 scope.go:117] "RemoveContainer" containerID="7d0327bb2e917674acd987a54f739cec1b728299cad9b739e6ce3a217938284f" Oct 04 02:57:12 crc kubenswrapper[4964]: E1004 02:57:12.718547 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d0327bb2e917674acd987a54f739cec1b728299cad9b739e6ce3a217938284f\": container with ID starting with 7d0327bb2e917674acd987a54f739cec1b728299cad9b739e6ce3a217938284f not found: ID does not exist" containerID="7d0327bb2e917674acd987a54f739cec1b728299cad9b739e6ce3a217938284f" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.718585 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d0327bb2e917674acd987a54f739cec1b728299cad9b739e6ce3a217938284f"} err="failed to get container status \"7d0327bb2e917674acd987a54f739cec1b728299cad9b739e6ce3a217938284f\": rpc error: code = NotFound desc = could not find container \"7d0327bb2e917674acd987a54f739cec1b728299cad9b739e6ce3a217938284f\": container with ID starting with 7d0327bb2e917674acd987a54f739cec1b728299cad9b739e6ce3a217938284f not found: ID does not exist" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 
02:57:12.718652 4964 scope.go:117] "RemoveContainer" containerID="0381ea748c551878777ed429a2f538d1a9be67bdff7c3d71bf68dc6f0d19e89e" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.720672 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-config" (OuterVolumeSpecName: "config") pod "11b61102-db4d-41f3-975b-bed490026bf4" (UID: "11b61102-db4d-41f3-975b-bed490026bf4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:57:12 crc kubenswrapper[4964]: E1004 02:57:12.722038 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0381ea748c551878777ed429a2f538d1a9be67bdff7c3d71bf68dc6f0d19e89e\": container with ID starting with 0381ea748c551878777ed429a2f538d1a9be67bdff7c3d71bf68dc6f0d19e89e not found: ID does not exist" containerID="0381ea748c551878777ed429a2f538d1a9be67bdff7c3d71bf68dc6f0d19e89e" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.722087 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0381ea748c551878777ed429a2f538d1a9be67bdff7c3d71bf68dc6f0d19e89e"} err="failed to get container status \"0381ea748c551878777ed429a2f538d1a9be67bdff7c3d71bf68dc6f0d19e89e\": rpc error: code = NotFound desc = could not find container \"0381ea748c551878777ed429a2f538d1a9be67bdff7c3d71bf68dc6f0d19e89e\": container with ID starting with 0381ea748c551878777ed429a2f538d1a9be67bdff7c3d71bf68dc6f0d19e89e not found: ID does not exist" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.727172 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "11b61102-db4d-41f3-975b-bed490026bf4" (UID: "11b61102-db4d-41f3-975b-bed490026bf4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.734388 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11b61102-db4d-41f3-975b-bed490026bf4" (UID: "11b61102-db4d-41f3-975b-bed490026bf4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.744036 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "11b61102-db4d-41f3-975b-bed490026bf4" (UID: "11b61102-db4d-41f3-975b-bed490026bf4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.751231 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.751281 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.751293 4964 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.751301 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11b61102-db4d-41f3-975b-bed490026bf4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:12 crc 
kubenswrapper[4964]: I1004 02:57:12.751309 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpzg5\" (UniqueName: \"kubernetes.io/projected/11b61102-db4d-41f3-975b-bed490026bf4-kube-api-access-fpzg5\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.821019 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-r2wfw"] Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.905367 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-nv7b4"] Oct 04 02:57:12 crc kubenswrapper[4964]: I1004 02:57:12.913906 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-nv7b4"] Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.161150 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5cb49d4c48-qflxg"] Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.600375 4964 generic.go:334] "Generic (PLEG): container finished" podID="ee10b681-dd61-4ff3-a697-31cf616048cb" containerID="d09f6246519b0012634e9bd03ce0d8c51c14b444b8c2d0366344d6f6da90ddf5" exitCode=0 Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.600642 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" event={"ID":"ee10b681-dd61-4ff3-a697-31cf616048cb","Type":"ContainerDied","Data":"d09f6246519b0012634e9bd03ce0d8c51c14b444b8c2d0366344d6f6da90ddf5"} Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.600669 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" event={"ID":"ee10b681-dd61-4ff3-a697-31cf616048cb","Type":"ContainerStarted","Data":"8a6718b4b77aa343a82e6a5bc48914a581373977c711f6bd080bb5a2e1be7301"} Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.603335 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cb49d4c48-qflxg" 
event={"ID":"36580143-625b-4a5f-9ac1-890853622671","Type":"ContainerStarted","Data":"02f4b96938b7e17d87e507cf1cfe5d538553f1daefe29915e94f98a1594344c5"} Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.603374 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cb49d4c48-qflxg" event={"ID":"36580143-625b-4a5f-9ac1-890853622671","Type":"ContainerStarted","Data":"d013e7e2c0be437c4919a8d8b1d8389dd3bdf9404592f986ccaaebde9af57830"} Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.655291 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.912582 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-76d58c8cb5-9l2w8"] Oct 04 02:57:13 crc kubenswrapper[4964]: E1004 02:57:13.913215 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b61102-db4d-41f3-975b-bed490026bf4" containerName="dnsmasq-dns" Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.913294 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b61102-db4d-41f3-975b-bed490026bf4" containerName="dnsmasq-dns" Oct 04 02:57:13 crc kubenswrapper[4964]: E1004 02:57:13.913387 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b61102-db4d-41f3-975b-bed490026bf4" containerName="init" Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.913454 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b61102-db4d-41f3-975b-bed490026bf4" containerName="init" Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.913762 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b61102-db4d-41f3-975b-bed490026bf4" containerName="dnsmasq-dns" Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.914880 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.917416 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.917441 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.940526 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76d58c8cb5-9l2w8"] Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.972292 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-combined-ca-bundle\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.972364 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-config\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.972840 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-httpd-config\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.972890 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-ovndb-tls-certs\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.972942 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-internal-tls-certs\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.973000 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xzqt\" (UniqueName: \"kubernetes.io/projected/cce3503b-b92d-434a-b056-fa6832cff6d4-kube-api-access-8xzqt\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:13 crc kubenswrapper[4964]: I1004 02:57:13.973225 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-public-tls-certs\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.075173 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xzqt\" (UniqueName: \"kubernetes.io/projected/cce3503b-b92d-434a-b056-fa6832cff6d4-kube-api-access-8xzqt\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.075250 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-public-tls-certs\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.075295 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-combined-ca-bundle\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.075328 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-config\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.075372 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-httpd-config\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.075393 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-ovndb-tls-certs\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.075414 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-internal-tls-certs\") pod 
\"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.080147 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-config\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.080224 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-ovndb-tls-certs\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.080253 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-internal-tls-certs\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.081017 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-httpd-config\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.081673 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-combined-ca-bundle\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc 
kubenswrapper[4964]: I1004 02:57:14.082007 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cce3503b-b92d-434a-b056-fa6832cff6d4-public-tls-certs\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.098206 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xzqt\" (UniqueName: \"kubernetes.io/projected/cce3503b-b92d-434a-b056-fa6832cff6d4-kube-api-access-8xzqt\") pod \"neutron-76d58c8cb5-9l2w8\" (UID: \"cce3503b-b92d-434a-b056-fa6832cff6d4\") " pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.233220 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.630797 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cb49d4c48-qflxg" event={"ID":"36580143-625b-4a5f-9ac1-890853622671","Type":"ContainerStarted","Data":"c85a475de344e0363af685cd6304da92c1d17331911d5e3e663227c0d00650cd"} Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.631193 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.642994 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" event={"ID":"ee10b681-dd61-4ff3-a697-31cf616048cb","Type":"ContainerStarted","Data":"79450f558d10597308805369374df227f15cab75d6462dd1ed0276dccc35a794"} Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.643257 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.659400 4964 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5cb49d4c48-qflxg" podStartSLOduration=3.65937614 podStartE2EDuration="3.65937614s" podCreationTimestamp="2025-10-04 02:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:57:14.6564886 +0000 UTC m=+1014.553447258" watchObservedRunningTime="2025-10-04 02:57:14.65937614 +0000 UTC m=+1014.556334788" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.690057 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" podStartSLOduration=3.690034941 podStartE2EDuration="3.690034941s" podCreationTimestamp="2025-10-04 02:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:57:14.682777195 +0000 UTC m=+1014.579735883" watchObservedRunningTime="2025-10-04 02:57:14.690034941 +0000 UTC m=+1014.586993589" Oct 04 02:57:14 crc kubenswrapper[4964]: I1004 02:57:14.860417 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11b61102-db4d-41f3-975b-bed490026bf4" path="/var/lib/kubelet/pods/11b61102-db4d-41f3-975b-bed490026bf4/volumes" Oct 04 02:57:15 crc kubenswrapper[4964]: I1004 02:57:15.041098 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76d58c8cb5-9l2w8"] Oct 04 02:57:15 crc kubenswrapper[4964]: W1004 02:57:15.049829 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcce3503b_b92d_434a_b056_fa6832cff6d4.slice/crio-1e65d047f57cafb6a90ccbefe30d6ffbe8a5d2d55c8304a29adab1849aae6538 WatchSource:0}: Error finding container 1e65d047f57cafb6a90ccbefe30d6ffbe8a5d2d55c8304a29adab1849aae6538: Status 404 returned error can't find the container with id 
1e65d047f57cafb6a90ccbefe30d6ffbe8a5d2d55c8304a29adab1849aae6538 Oct 04 02:57:15 crc kubenswrapper[4964]: I1004 02:57:15.652563 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76d58c8cb5-9l2w8" event={"ID":"cce3503b-b92d-434a-b056-fa6832cff6d4","Type":"ContainerStarted","Data":"37263ae643f752e747ba7e27a29de89b2292ed8da6e7703985bdcf772505209f"} Oct 04 02:57:15 crc kubenswrapper[4964]: I1004 02:57:15.652843 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76d58c8cb5-9l2w8" event={"ID":"cce3503b-b92d-434a-b056-fa6832cff6d4","Type":"ContainerStarted","Data":"965e42921a0598900c5d3a0e868126a4d8a524f9fcb2e62370c1089f70672a7c"} Oct 04 02:57:15 crc kubenswrapper[4964]: I1004 02:57:15.652852 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76d58c8cb5-9l2w8" event={"ID":"cce3503b-b92d-434a-b056-fa6832cff6d4","Type":"ContainerStarted","Data":"1e65d047f57cafb6a90ccbefe30d6ffbe8a5d2d55c8304a29adab1849aae6538"} Oct 04 02:57:15 crc kubenswrapper[4964]: I1004 02:57:15.683969 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-76d58c8cb5-9l2w8" podStartSLOduration=2.683949734 podStartE2EDuration="2.683949734s" podCreationTimestamp="2025-10-04 02:57:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:57:15.681181307 +0000 UTC m=+1015.578139975" watchObservedRunningTime="2025-10-04 02:57:15.683949734 +0000 UTC m=+1015.580908392" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.566362 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.629945 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-scripts\") pod \"8e531d8d-5769-4736-b8cd-0121cc087e2e\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.630194 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl8dd\" (UniqueName: \"kubernetes.io/projected/8e531d8d-5769-4736-b8cd-0121cc087e2e-kube-api-access-fl8dd\") pod \"8e531d8d-5769-4736-b8cd-0121cc087e2e\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.630270 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e531d8d-5769-4736-b8cd-0121cc087e2e-log-httpd\") pod \"8e531d8d-5769-4736-b8cd-0121cc087e2e\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.630361 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-combined-ca-bundle\") pod \"8e531d8d-5769-4736-b8cd-0121cc087e2e\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.630779 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e531d8d-5769-4736-b8cd-0121cc087e2e-run-httpd\") pod \"8e531d8d-5769-4736-b8cd-0121cc087e2e\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.630906 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-sg-core-conf-yaml\") pod \"8e531d8d-5769-4736-b8cd-0121cc087e2e\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.630965 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-config-data\") pod \"8e531d8d-5769-4736-b8cd-0121cc087e2e\" (UID: \"8e531d8d-5769-4736-b8cd-0121cc087e2e\") " Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.632294 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e531d8d-5769-4736-b8cd-0121cc087e2e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8e531d8d-5769-4736-b8cd-0121cc087e2e" (UID: "8e531d8d-5769-4736-b8cd-0121cc087e2e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.634659 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e531d8d-5769-4736-b8cd-0121cc087e2e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8e531d8d-5769-4736-b8cd-0121cc087e2e" (UID: "8e531d8d-5769-4736-b8cd-0121cc087e2e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.636244 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-scripts" (OuterVolumeSpecName: "scripts") pod "8e531d8d-5769-4736-b8cd-0121cc087e2e" (UID: "8e531d8d-5769-4736-b8cd-0121cc087e2e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.638872 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e531d8d-5769-4736-b8cd-0121cc087e2e-kube-api-access-fl8dd" (OuterVolumeSpecName: "kube-api-access-fl8dd") pod "8e531d8d-5769-4736-b8cd-0121cc087e2e" (UID: "8e531d8d-5769-4736-b8cd-0121cc087e2e"). InnerVolumeSpecName "kube-api-access-fl8dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.695163 4964 generic.go:334] "Generic (PLEG): container finished" podID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerID="0809a491687d9a54e43b2778c1219519c7bc2445f90ae7f8903490bea6f57d0e" exitCode=0 Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.695258 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.695272 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e531d8d-5769-4736-b8cd-0121cc087e2e","Type":"ContainerDied","Data":"0809a491687d9a54e43b2778c1219519c7bc2445f90ae7f8903490bea6f57d0e"} Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.695676 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e531d8d-5769-4736-b8cd-0121cc087e2e","Type":"ContainerDied","Data":"58287914ddd70e30acbf7b2f02ccd87176266e94a31de0d0f60f84e4e99d4e37"} Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.695700 4964 scope.go:117] "RemoveContainer" containerID="7ac3feddb2d6f0a426bfe337f732419a83e40165d77803230a04f3b71fee965d" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.695775 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.698685 4964 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8e531d8d-5769-4736-b8cd-0121cc087e2e" (UID: "8e531d8d-5769-4736-b8cd-0121cc087e2e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.705969 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e531d8d-5769-4736-b8cd-0121cc087e2e" (UID: "8e531d8d-5769-4736-b8cd-0121cc087e2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.718118 4964 scope.go:117] "RemoveContainer" containerID="8ef0878061622cda1c154452219b93ed6c75505fadf2d443df39f43a0333f867" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.733210 4964 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.733233 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.733243 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl8dd\" (UniqueName: \"kubernetes.io/projected/8e531d8d-5769-4736-b8cd-0121cc087e2e-kube-api-access-fl8dd\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.733250 4964 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8e531d8d-5769-4736-b8cd-0121cc087e2e-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.733259 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.733267 4964 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e531d8d-5769-4736-b8cd-0121cc087e2e-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.735816 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-config-data" (OuterVolumeSpecName: "config-data") pod "8e531d8d-5769-4736-b8cd-0121cc087e2e" (UID: "8e531d8d-5769-4736-b8cd-0121cc087e2e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.736410 4964 scope.go:117] "RemoveContainer" containerID="0809a491687d9a54e43b2778c1219519c7bc2445f90ae7f8903490bea6f57d0e" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.764085 4964 scope.go:117] "RemoveContainer" containerID="be1e3fb11f0a12bf9efa7291c4d91b3a114b14c15d94394925043662efaa4d73" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.791215 4964 scope.go:117] "RemoveContainer" containerID="7ac3feddb2d6f0a426bfe337f732419a83e40165d77803230a04f3b71fee965d" Oct 04 02:57:16 crc kubenswrapper[4964]: E1004 02:57:16.792013 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ac3feddb2d6f0a426bfe337f732419a83e40165d77803230a04f3b71fee965d\": container with ID starting with 7ac3feddb2d6f0a426bfe337f732419a83e40165d77803230a04f3b71fee965d not found: ID does not exist" containerID="7ac3feddb2d6f0a426bfe337f732419a83e40165d77803230a04f3b71fee965d" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.792043 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ac3feddb2d6f0a426bfe337f732419a83e40165d77803230a04f3b71fee965d"} err="failed to get container status \"7ac3feddb2d6f0a426bfe337f732419a83e40165d77803230a04f3b71fee965d\": rpc error: code = NotFound desc = could not find container \"7ac3feddb2d6f0a426bfe337f732419a83e40165d77803230a04f3b71fee965d\": container with ID starting with 7ac3feddb2d6f0a426bfe337f732419a83e40165d77803230a04f3b71fee965d not found: ID does not exist" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.792065 4964 scope.go:117] "RemoveContainer" containerID="8ef0878061622cda1c154452219b93ed6c75505fadf2d443df39f43a0333f867" Oct 04 02:57:16 crc kubenswrapper[4964]: E1004 02:57:16.792377 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"8ef0878061622cda1c154452219b93ed6c75505fadf2d443df39f43a0333f867\": container with ID starting with 8ef0878061622cda1c154452219b93ed6c75505fadf2d443df39f43a0333f867 not found: ID does not exist" containerID="8ef0878061622cda1c154452219b93ed6c75505fadf2d443df39f43a0333f867" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.792401 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef0878061622cda1c154452219b93ed6c75505fadf2d443df39f43a0333f867"} err="failed to get container status \"8ef0878061622cda1c154452219b93ed6c75505fadf2d443df39f43a0333f867\": rpc error: code = NotFound desc = could not find container \"8ef0878061622cda1c154452219b93ed6c75505fadf2d443df39f43a0333f867\": container with ID starting with 8ef0878061622cda1c154452219b93ed6c75505fadf2d443df39f43a0333f867 not found: ID does not exist" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.792413 4964 scope.go:117] "RemoveContainer" containerID="0809a491687d9a54e43b2778c1219519c7bc2445f90ae7f8903490bea6f57d0e" Oct 04 02:57:16 crc kubenswrapper[4964]: E1004 02:57:16.792626 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0809a491687d9a54e43b2778c1219519c7bc2445f90ae7f8903490bea6f57d0e\": container with ID starting with 0809a491687d9a54e43b2778c1219519c7bc2445f90ae7f8903490bea6f57d0e not found: ID does not exist" containerID="0809a491687d9a54e43b2778c1219519c7bc2445f90ae7f8903490bea6f57d0e" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.792647 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0809a491687d9a54e43b2778c1219519c7bc2445f90ae7f8903490bea6f57d0e"} err="failed to get container status \"0809a491687d9a54e43b2778c1219519c7bc2445f90ae7f8903490bea6f57d0e\": rpc error: code = NotFound desc = could not find container \"0809a491687d9a54e43b2778c1219519c7bc2445f90ae7f8903490bea6f57d0e\": 
container with ID starting with 0809a491687d9a54e43b2778c1219519c7bc2445f90ae7f8903490bea6f57d0e not found: ID does not exist" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.792660 4964 scope.go:117] "RemoveContainer" containerID="be1e3fb11f0a12bf9efa7291c4d91b3a114b14c15d94394925043662efaa4d73" Oct 04 02:57:16 crc kubenswrapper[4964]: E1004 02:57:16.792875 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be1e3fb11f0a12bf9efa7291c4d91b3a114b14c15d94394925043662efaa4d73\": container with ID starting with be1e3fb11f0a12bf9efa7291c4d91b3a114b14c15d94394925043662efaa4d73 not found: ID does not exist" containerID="be1e3fb11f0a12bf9efa7291c4d91b3a114b14c15d94394925043662efaa4d73" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.792898 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be1e3fb11f0a12bf9efa7291c4d91b3a114b14c15d94394925043662efaa4d73"} err="failed to get container status \"be1e3fb11f0a12bf9efa7291c4d91b3a114b14c15d94394925043662efaa4d73\": rpc error: code = NotFound desc = could not find container \"be1e3fb11f0a12bf9efa7291c4d91b3a114b14c15d94394925043662efaa4d73\": container with ID starting with be1e3fb11f0a12bf9efa7291c4d91b3a114b14c15d94394925043662efaa4d73 not found: ID does not exist" Oct 04 02:57:16 crc kubenswrapper[4964]: I1004 02:57:16.835260 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e531d8d-5769-4736-b8cd-0121cc087e2e-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.020575 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.037136 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.042862 4964 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:17 crc kubenswrapper[4964]: E1004 02:57:17.046184 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="ceilometer-central-agent" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.046206 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="ceilometer-central-agent" Oct 04 02:57:17 crc kubenswrapper[4964]: E1004 02:57:17.046223 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="proxy-httpd" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.046229 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="proxy-httpd" Oct 04 02:57:17 crc kubenswrapper[4964]: E1004 02:57:17.046243 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="ceilometer-notification-agent" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.046250 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="ceilometer-notification-agent" Oct 04 02:57:17 crc kubenswrapper[4964]: E1004 02:57:17.046274 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="sg-core" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.046280 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="sg-core" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.046435 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="ceilometer-notification-agent" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.046452 4964 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="ceilometer-central-agent" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.046471 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="sg-core" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.046478 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" containerName="proxy-httpd" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.048186 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.057051 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.060873 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.061102 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.141277 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-run-httpd\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.141329 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.141366 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-scripts\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.141391 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-config-data\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.141408 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.141434 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2fnx\" (UniqueName: \"kubernetes.io/projected/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-kube-api-access-b2fnx\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.141499 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-log-httpd\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.161889 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 
02:57:17.201097 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-855ccd47c4-qrtzn" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.249532 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-log-httpd\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.249662 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-run-httpd\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.249713 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.249779 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-scripts\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.249826 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-config-data\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.249854 4964 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.249891 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2fnx\" (UniqueName: \"kubernetes.io/projected/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-kube-api-access-b2fnx\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.250825 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-run-httpd\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.251080 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-log-httpd\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.254090 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-scripts\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.254865 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-config-data\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: 
I1004 02:57:17.256244 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.257230 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.267158 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2fnx\" (UniqueName: \"kubernetes.io/projected/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-kube-api-access-b2fnx\") pod \"ceilometer-0\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.293592 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fc588d58-bq7fk" podUID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.145:9311/healthcheck\": read tcp 10.217.0.2:50536->10.217.0.145:9311: read: connection reset by peer" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.293742 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fc588d58-bq7fk" podUID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.145:9311/healthcheck\": read tcp 10.217.0.2:50528->10.217.0.145:9311: read: connection reset by peer" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.383068 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.713253 4964 generic.go:334] "Generic (PLEG): container finished" podID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" containerID="3f6ad1590cdafd238253a80d892b42c7c494258e10f46bb4176ecb5113422e78" exitCode=0 Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.713330 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fc588d58-bq7fk" event={"ID":"1cb5d6a7-b37d-4edb-8210-97e60614fc0e","Type":"ContainerDied","Data":"3f6ad1590cdafd238253a80d892b42c7c494258e10f46bb4176ecb5113422e78"} Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.713726 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fc588d58-bq7fk" event={"ID":"1cb5d6a7-b37d-4edb-8210-97e60614fc0e","Type":"ContainerDied","Data":"018f7a39d862c27408401e014c2e5aea0065f507ae8406834b4b55d1409c25e8"} Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.713743 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="018f7a39d862c27408401e014c2e5aea0065f507ae8406834b4b55d1409c25e8" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.714933 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-c8kkp" event={"ID":"969137a9-7e00-4472-8582-8008c5647750","Type":"ContainerStarted","Data":"957246df7e11551087d052bc3fc241dd0461dbec53b8b0cae9066791872c0f1d"} Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.749200 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-c8kkp" podStartSLOduration=3.203808163 podStartE2EDuration="39.74917471s" podCreationTimestamp="2025-10-04 02:56:38 +0000 UTC" firstStartedPulling="2025-10-04 02:56:39.945322356 +0000 UTC m=+979.842280994" lastFinishedPulling="2025-10-04 02:57:16.490688883 +0000 UTC m=+1016.387647541" observedRunningTime="2025-10-04 02:57:17.735217413 +0000 UTC m=+1017.632176051" 
watchObservedRunningTime="2025-10-04 02:57:17.74917471 +0000 UTC m=+1017.646133348" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.757920 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.864412 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-config-data\") pod \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.864532 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-226vr\" (UniqueName: \"kubernetes.io/projected/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-kube-api-access-226vr\") pod \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.864554 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-logs\") pod \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.864657 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-config-data-custom\") pod \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\" (UID: \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.864732 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-combined-ca-bundle\") pod \"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\" (UID: 
\"1cb5d6a7-b37d-4edb-8210-97e60614fc0e\") " Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.865166 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-logs" (OuterVolumeSpecName: "logs") pod "1cb5d6a7-b37d-4edb-8210-97e60614fc0e" (UID: "1cb5d6a7-b37d-4edb-8210-97e60614fc0e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.870381 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1cb5d6a7-b37d-4edb-8210-97e60614fc0e" (UID: "1cb5d6a7-b37d-4edb-8210-97e60614fc0e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.873753 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-kube-api-access-226vr" (OuterVolumeSpecName: "kube-api-access-226vr") pod "1cb5d6a7-b37d-4edb-8210-97e60614fc0e" (UID: "1cb5d6a7-b37d-4edb-8210-97e60614fc0e"). InnerVolumeSpecName "kube-api-access-226vr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.895432 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:17 crc kubenswrapper[4964]: W1004 02:57:17.895942 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6fb06b3_f829_41c8_a37d_90f6d38ee79f.slice/crio-3b3573a49e40e9e680c316ceec77a20bb15ff9af0a658183f3ec0b20985f4f93 WatchSource:0}: Error finding container 3b3573a49e40e9e680c316ceec77a20bb15ff9af0a658183f3ec0b20985f4f93: Status 404 returned error can't find the container with id 3b3573a49e40e9e680c316ceec77a20bb15ff9af0a658183f3ec0b20985f4f93 Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.895940 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cb5d6a7-b37d-4edb-8210-97e60614fc0e" (UID: "1cb5d6a7-b37d-4edb-8210-97e60614fc0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.925794 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-config-data" (OuterVolumeSpecName: "config-data") pod "1cb5d6a7-b37d-4edb-8210-97e60614fc0e" (UID: "1cb5d6a7-b37d-4edb-8210-97e60614fc0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.967313 4964 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.967343 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.967353 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.967362 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-226vr\" (UniqueName: \"kubernetes.io/projected/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-kube-api-access-226vr\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:17 crc kubenswrapper[4964]: I1004 02:57:17.967375 4964 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb5d6a7-b37d-4edb-8210-97e60614fc0e-logs\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:18 crc kubenswrapper[4964]: I1004 02:57:18.724354 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6fb06b3-f829-41c8-a37d-90f6d38ee79f","Type":"ContainerStarted","Data":"2282a37374a050ba26d66a9c8c8e914d542376999861483d89204f6aae45c683"} Oct 04 02:57:18 crc kubenswrapper[4964]: I1004 02:57:18.724865 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6fb06b3-f829-41c8-a37d-90f6d38ee79f","Type":"ContainerStarted","Data":"3b3573a49e40e9e680c316ceec77a20bb15ff9af0a658183f3ec0b20985f4f93"} Oct 04 
02:57:18 crc kubenswrapper[4964]: I1004 02:57:18.724382 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fc588d58-bq7fk" Oct 04 02:57:18 crc kubenswrapper[4964]: I1004 02:57:18.772215 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6fc588d58-bq7fk"] Oct 04 02:57:18 crc kubenswrapper[4964]: I1004 02:57:18.781171 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6fc588d58-bq7fk"] Oct 04 02:57:18 crc kubenswrapper[4964]: I1004 02:57:18.864469 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" path="/var/lib/kubelet/pods/1cb5d6a7-b37d-4edb-8210-97e60614fc0e/volumes" Oct 04 02:57:18 crc kubenswrapper[4964]: I1004 02:57:18.865497 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e531d8d-5769-4736-b8cd-0121cc087e2e" path="/var/lib/kubelet/pods/8e531d8d-5769-4736-b8cd-0121cc087e2e/volumes" Oct 04 02:57:19 crc kubenswrapper[4964]: I1004 02:57:19.214000 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-58576f76db-7mmj9" Oct 04 02:57:19 crc kubenswrapper[4964]: I1004 02:57:19.738592 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6fb06b3-f829-41c8-a37d-90f6d38ee79f","Type":"ContainerStarted","Data":"5d5292f4e1d8bac14fa24400aba9cc89c69adb08f94439cf82818442aaff4d83"} Oct 04 02:57:20 crc kubenswrapper[4964]: I1004 02:57:20.784298 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6fb06b3-f829-41c8-a37d-90f6d38ee79f","Type":"ContainerStarted","Data":"2a08642a447779d9925bb0a405ed1079a40252d0d4eb8b8ee7c4665242c66414"} Oct 04 02:57:21 crc kubenswrapper[4964]: I1004 02:57:21.793361 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c6fb06b3-f829-41c8-a37d-90f6d38ee79f","Type":"ContainerStarted","Data":"b6be6db21b6bd65722c32979d73053fde62d0a41fec8e0a053874ba1628ce809"} Oct 04 02:57:21 crc kubenswrapper[4964]: I1004 02:57:21.795546 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 02:57:21 crc kubenswrapper[4964]: I1004 02:57:21.797032 4964 generic.go:334] "Generic (PLEG): container finished" podID="969137a9-7e00-4472-8582-8008c5647750" containerID="957246df7e11551087d052bc3fc241dd0461dbec53b8b0cae9066791872c0f1d" exitCode=0 Oct 04 02:57:21 crc kubenswrapper[4964]: I1004 02:57:21.797058 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-c8kkp" event={"ID":"969137a9-7e00-4472-8582-8008c5647750","Type":"ContainerDied","Data":"957246df7e11551087d052bc3fc241dd0461dbec53b8b0cae9066791872c0f1d"} Oct 04 02:57:21 crc kubenswrapper[4964]: I1004 02:57:21.848078 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.252176258 podStartE2EDuration="4.84805694s" podCreationTimestamp="2025-10-04 02:57:17 +0000 UTC" firstStartedPulling="2025-10-04 02:57:17.89853708 +0000 UTC m=+1017.795495708" lastFinishedPulling="2025-10-04 02:57:21.494417722 +0000 UTC m=+1021.391376390" observedRunningTime="2025-10-04 02:57:21.828898486 +0000 UTC m=+1021.725857134" watchObservedRunningTime="2025-10-04 02:57:21.84805694 +0000 UTC m=+1021.745015598" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.219935 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 04 02:57:22 crc kubenswrapper[4964]: E1004 02:57:22.220271 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" containerName="barbican-api-log" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.220289 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" 
containerName="barbican-api-log" Oct 04 02:57:22 crc kubenswrapper[4964]: E1004 02:57:22.220301 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" containerName="barbican-api" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.220308 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" containerName="barbican-api" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.220484 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" containerName="barbican-api" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.220512 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" containerName="barbican-api-log" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.221124 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.223782 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.224062 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.224495 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-zzwdx" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.227870 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.253663 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.302008 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7987f74bbc-p8tj2"] Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.302431 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" podUID="488b1daa-7288-46cf-b351-2d6cec22c917" containerName="dnsmasq-dns" containerID="cri-o://4e914fdd841b439ed4574800421d4923941a5c357d0ebcaad3d08e9e4a9e3059" gracePeriod=10 Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.406609 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z58r\" (UniqueName: \"kubernetes.io/projected/5ca73069-dcdb-4be1-bb50-3455eed66412-kube-api-access-7z58r\") pod \"openstackclient\" (UID: \"5ca73069-dcdb-4be1-bb50-3455eed66412\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.407041 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca73069-dcdb-4be1-bb50-3455eed66412-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5ca73069-dcdb-4be1-bb50-3455eed66412\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.407070 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5ca73069-dcdb-4be1-bb50-3455eed66412-openstack-config\") pod \"openstackclient\" (UID: \"5ca73069-dcdb-4be1-bb50-3455eed66412\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.407267 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5ca73069-dcdb-4be1-bb50-3455eed66412-openstack-config-secret\") pod \"openstackclient\" (UID: \"5ca73069-dcdb-4be1-bb50-3455eed66412\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc 
kubenswrapper[4964]: I1004 02:57:22.435903 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 04 02:57:22 crc kubenswrapper[4964]: E1004 02:57:22.436525 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-7z58r openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="5ca73069-dcdb-4be1-bb50-3455eed66412" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.458084 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.469712 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.470807 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.483416 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.508653 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca73069-dcdb-4be1-bb50-3455eed66412-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5ca73069-dcdb-4be1-bb50-3455eed66412\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.508704 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5ca73069-dcdb-4be1-bb50-3455eed66412-openstack-config\") pod \"openstackclient\" (UID: \"5ca73069-dcdb-4be1-bb50-3455eed66412\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.508759 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5ca73069-dcdb-4be1-bb50-3455eed66412-openstack-config-secret\") pod \"openstackclient\" (UID: \"5ca73069-dcdb-4be1-bb50-3455eed66412\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.508818 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z58r\" (UniqueName: \"kubernetes.io/projected/5ca73069-dcdb-4be1-bb50-3455eed66412-kube-api-access-7z58r\") pod \"openstackclient\" (UID: \"5ca73069-dcdb-4be1-bb50-3455eed66412\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.509791 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5ca73069-dcdb-4be1-bb50-3455eed66412-openstack-config\") pod \"openstackclient\" (UID: \"5ca73069-dcdb-4be1-bb50-3455eed66412\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: E1004 02:57:22.510372 4964 projected.go:194] Error preparing data for projected volume kube-api-access-7z58r for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (5ca73069-dcdb-4be1-bb50-3455eed66412) does not match the UID in record. The object might have been deleted and then recreated Oct 04 02:57:22 crc kubenswrapper[4964]: E1004 02:57:22.510413 4964 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ca73069-dcdb-4be1-bb50-3455eed66412-kube-api-access-7z58r podName:5ca73069-dcdb-4be1-bb50-3455eed66412 nodeName:}" failed. No retries permitted until 2025-10-04 02:57:23.010399318 +0000 UTC m=+1022.907357956 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7z58r" (UniqueName: "kubernetes.io/projected/5ca73069-dcdb-4be1-bb50-3455eed66412-kube-api-access-7z58r") pod "openstackclient" (UID: "5ca73069-dcdb-4be1-bb50-3455eed66412") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (5ca73069-dcdb-4be1-bb50-3455eed66412) does not match the UID in record. The object might have been deleted and then recreated Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.516340 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca73069-dcdb-4be1-bb50-3455eed66412-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5ca73069-dcdb-4be1-bb50-3455eed66412\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.528226 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5ca73069-dcdb-4be1-bb50-3455eed66412-openstack-config-secret\") pod \"openstackclient\" (UID: \"5ca73069-dcdb-4be1-bb50-3455eed66412\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.610249 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0e34537c-4b0f-4683-bf8a-3b56e44424b1-openstack-config-secret\") pod \"openstackclient\" (UID: \"0e34537c-4b0f-4683-bf8a-3b56e44424b1\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.610292 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49f7\" (UniqueName: \"kubernetes.io/projected/0e34537c-4b0f-4683-bf8a-3b56e44424b1-kube-api-access-b49f7\") pod \"openstackclient\" (UID: \"0e34537c-4b0f-4683-bf8a-3b56e44424b1\") " 
pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.610357 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0e34537c-4b0f-4683-bf8a-3b56e44424b1-openstack-config\") pod \"openstackclient\" (UID: \"0e34537c-4b0f-4683-bf8a-3b56e44424b1\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.610397 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e34537c-4b0f-4683-bf8a-3b56e44424b1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0e34537c-4b0f-4683-bf8a-3b56e44424b1\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.621794 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fc588d58-bq7fk" podUID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.145:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.622153 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fc588d58-bq7fk" podUID="1cb5d6a7-b37d-4edb-8210-97e60614fc0e" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.145:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.712882 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0e34537c-4b0f-4683-bf8a-3b56e44424b1-openstack-config\") pod \"openstackclient\" (UID: \"0e34537c-4b0f-4683-bf8a-3b56e44424b1\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc 
kubenswrapper[4964]: I1004 02:57:22.713186 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e34537c-4b0f-4683-bf8a-3b56e44424b1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0e34537c-4b0f-4683-bf8a-3b56e44424b1\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.713279 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0e34537c-4b0f-4683-bf8a-3b56e44424b1-openstack-config-secret\") pod \"openstackclient\" (UID: \"0e34537c-4b0f-4683-bf8a-3b56e44424b1\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.713300 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49f7\" (UniqueName: \"kubernetes.io/projected/0e34537c-4b0f-4683-bf8a-3b56e44424b1-kube-api-access-b49f7\") pod \"openstackclient\" (UID: \"0e34537c-4b0f-4683-bf8a-3b56e44424b1\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.714359 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0e34537c-4b0f-4683-bf8a-3b56e44424b1-openstack-config\") pod \"openstackclient\" (UID: \"0e34537c-4b0f-4683-bf8a-3b56e44424b1\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.716547 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e34537c-4b0f-4683-bf8a-3b56e44424b1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0e34537c-4b0f-4683-bf8a-3b56e44424b1\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.717001 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/0e34537c-4b0f-4683-bf8a-3b56e44424b1-openstack-config-secret\") pod \"openstackclient\" (UID: \"0e34537c-4b0f-4683-bf8a-3b56e44424b1\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.729512 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49f7\" (UniqueName: \"kubernetes.io/projected/0e34537c-4b0f-4683-bf8a-3b56e44424b1-kube-api-access-b49f7\") pod \"openstackclient\" (UID: \"0e34537c-4b0f-4683-bf8a-3b56e44424b1\") " pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.803845 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.811038 4964 generic.go:334] "Generic (PLEG): container finished" podID="488b1daa-7288-46cf-b351-2d6cec22c917" containerID="4e914fdd841b439ed4574800421d4923941a5c357d0ebcaad3d08e9e4a9e3059" exitCode=0 Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.811267 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" event={"ID":"488b1daa-7288-46cf-b351-2d6cec22c917","Type":"ContainerDied","Data":"4e914fdd841b439ed4574800421d4923941a5c357d0ebcaad3d08e9e4a9e3059"} Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.811313 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" event={"ID":"488b1daa-7288-46cf-b351-2d6cec22c917","Type":"ContainerDied","Data":"bfff716ee2e055bdb0b8b0f9e9818d334ac5b0ea34005d0190d19bf146fea0c4"} Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.811324 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfff716ee2e055bdb0b8b0f9e9818d334ac5b0ea34005d0190d19bf146fea0c4" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.812327 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.816109 4964 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="5ca73069-dcdb-4be1-bb50-3455eed66412" podUID="0e34537c-4b0f-4683-bf8a-3b56e44424b1" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.825434 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.898143 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.924966 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz5p6\" (UniqueName: \"kubernetes.io/projected/488b1daa-7288-46cf-b351-2d6cec22c917-kube-api-access-rz5p6\") pod \"488b1daa-7288-46cf-b351-2d6cec22c917\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.925013 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-dns-svc\") pod \"488b1daa-7288-46cf-b351-2d6cec22c917\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.925032 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-ovsdbserver-nb\") pod \"488b1daa-7288-46cf-b351-2d6cec22c917\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.925061 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/5ca73069-dcdb-4be1-bb50-3455eed66412-openstack-config\") pod \"5ca73069-dcdb-4be1-bb50-3455eed66412\" (UID: \"5ca73069-dcdb-4be1-bb50-3455eed66412\") " Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.925093 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5ca73069-dcdb-4be1-bb50-3455eed66412-openstack-config-secret\") pod \"5ca73069-dcdb-4be1-bb50-3455eed66412\" (UID: \"5ca73069-dcdb-4be1-bb50-3455eed66412\") " Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.925356 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z58r\" (UniqueName: \"kubernetes.io/projected/5ca73069-dcdb-4be1-bb50-3455eed66412-kube-api-access-7z58r\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.926797 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ca73069-dcdb-4be1-bb50-3455eed66412-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5ca73069-dcdb-4be1-bb50-3455eed66412" (UID: "5ca73069-dcdb-4be1-bb50-3455eed66412"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.930829 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/488b1daa-7288-46cf-b351-2d6cec22c917-kube-api-access-rz5p6" (OuterVolumeSpecName: "kube-api-access-rz5p6") pod "488b1daa-7288-46cf-b351-2d6cec22c917" (UID: "488b1daa-7288-46cf-b351-2d6cec22c917"). InnerVolumeSpecName "kube-api-access-rz5p6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.940190 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ca73069-dcdb-4be1-bb50-3455eed66412-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5ca73069-dcdb-4be1-bb50-3455eed66412" (UID: "5ca73069-dcdb-4be1-bb50-3455eed66412"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.975351 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "488b1daa-7288-46cf-b351-2d6cec22c917" (UID: "488b1daa-7288-46cf-b351-2d6cec22c917"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:57:22 crc kubenswrapper[4964]: I1004 02:57:22.985861 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "488b1daa-7288-46cf-b351-2d6cec22c917" (UID: "488b1daa-7288-46cf-b351-2d6cec22c917"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.025918 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-ovsdbserver-sb\") pod \"488b1daa-7288-46cf-b351-2d6cec22c917\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.025951 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca73069-dcdb-4be1-bb50-3455eed66412-combined-ca-bundle\") pod \"5ca73069-dcdb-4be1-bb50-3455eed66412\" (UID: \"5ca73069-dcdb-4be1-bb50-3455eed66412\") " Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.026053 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-config\") pod \"488b1daa-7288-46cf-b351-2d6cec22c917\" (UID: \"488b1daa-7288-46cf-b351-2d6cec22c917\") " Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.026364 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz5p6\" (UniqueName: \"kubernetes.io/projected/488b1daa-7288-46cf-b351-2d6cec22c917-kube-api-access-rz5p6\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.026374 4964 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.026383 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.026392 4964 
reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5ca73069-dcdb-4be1-bb50-3455eed66412-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.026400 4964 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5ca73069-dcdb-4be1-bb50-3455eed66412-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.033439 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ca73069-dcdb-4be1-bb50-3455eed66412-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ca73069-dcdb-4be1-bb50-3455eed66412" (UID: "5ca73069-dcdb-4be1-bb50-3455eed66412"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.109590 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-config" (OuterVolumeSpecName: "config") pod "488b1daa-7288-46cf-b351-2d6cec22c917" (UID: "488b1daa-7288-46cf-b351-2d6cec22c917"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.109938 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "488b1daa-7288-46cf-b351-2d6cec22c917" (UID: "488b1daa-7288-46cf-b351-2d6cec22c917"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.128060 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.128092 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca73069-dcdb-4be1-bb50-3455eed66412-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.128101 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/488b1daa-7288-46cf-b351-2d6cec22c917-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.186638 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.331392 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-scripts\") pod \"969137a9-7e00-4472-8582-8008c5647750\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.331532 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/969137a9-7e00-4472-8582-8008c5647750-etc-machine-id\") pod \"969137a9-7e00-4472-8582-8008c5647750\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.331571 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-combined-ca-bundle\") pod 
\"969137a9-7e00-4472-8582-8008c5647750\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.331668 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-db-sync-config-data\") pod \"969137a9-7e00-4472-8582-8008c5647750\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.331689 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-config-data\") pod \"969137a9-7e00-4472-8582-8008c5647750\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.331698 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969137a9-7e00-4472-8582-8008c5647750-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "969137a9-7e00-4472-8582-8008c5647750" (UID: "969137a9-7e00-4472-8582-8008c5647750"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.331711 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvxgp\" (UniqueName: \"kubernetes.io/projected/969137a9-7e00-4472-8582-8008c5647750-kube-api-access-pvxgp\") pod \"969137a9-7e00-4472-8582-8008c5647750\" (UID: \"969137a9-7e00-4472-8582-8008c5647750\") " Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.332431 4964 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/969137a9-7e00-4472-8582-8008c5647750-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.334438 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/969137a9-7e00-4472-8582-8008c5647750-kube-api-access-pvxgp" (OuterVolumeSpecName: "kube-api-access-pvxgp") pod "969137a9-7e00-4472-8582-8008c5647750" (UID: "969137a9-7e00-4472-8582-8008c5647750"). InnerVolumeSpecName "kube-api-access-pvxgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.334483 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "969137a9-7e00-4472-8582-8008c5647750" (UID: "969137a9-7e00-4472-8582-8008c5647750"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.336518 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-scripts" (OuterVolumeSpecName: "scripts") pod "969137a9-7e00-4472-8582-8008c5647750" (UID: "969137a9-7e00-4472-8582-8008c5647750"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.362797 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "969137a9-7e00-4472-8582-8008c5647750" (UID: "969137a9-7e00-4472-8582-8008c5647750"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.387989 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.421507 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-config-data" (OuterVolumeSpecName: "config-data") pod "969137a9-7e00-4472-8582-8008c5647750" (UID: "969137a9-7e00-4472-8582-8008c5647750"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.433573 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.433634 4964 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.433647 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.433659 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvxgp\" (UniqueName: \"kubernetes.io/projected/969137a9-7e00-4472-8582-8008c5647750-kube-api-access-pvxgp\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.433674 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/969137a9-7e00-4472-8582-8008c5647750-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.820511 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0e34537c-4b0f-4683-bf8a-3b56e44424b1","Type":"ContainerStarted","Data":"3eab2641172c35ebf5d475c74f9a738b2e42198d5c1838de4322a63b25bc0224"} Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.823787 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-c8kkp" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.833945 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-p8tj2" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.833969 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-c8kkp" event={"ID":"969137a9-7e00-4472-8582-8008c5647750","Type":"ContainerDied","Data":"e699e02e3c2b164afc9b7797efb1349a3e100416edf93524b4369fab0716a0ee"} Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.834028 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e699e02e3c2b164afc9b7797efb1349a3e100416edf93524b4369fab0716a0ee" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.833970 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.850519 4964 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="5ca73069-dcdb-4be1-bb50-3455eed66412" podUID="0e34537c-4b0f-4683-bf8a-3b56e44424b1" Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.875749 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-p8tj2"] Oct 04 02:57:23 crc kubenswrapper[4964]: I1004 02:57:23.891363 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-p8tj2"] Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.053150 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 02:57:24 crc kubenswrapper[4964]: E1004 02:57:24.053591 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488b1daa-7288-46cf-b351-2d6cec22c917" containerName="init" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.053634 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="488b1daa-7288-46cf-b351-2d6cec22c917" containerName="init" Oct 04 02:57:24 crc kubenswrapper[4964]: E1004 02:57:24.053660 4964 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="488b1daa-7288-46cf-b351-2d6cec22c917" containerName="dnsmasq-dns" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.053668 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="488b1daa-7288-46cf-b351-2d6cec22c917" containerName="dnsmasq-dns" Oct 04 02:57:24 crc kubenswrapper[4964]: E1004 02:57:24.053693 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969137a9-7e00-4472-8582-8008c5647750" containerName="cinder-db-sync" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.053702 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="969137a9-7e00-4472-8582-8008c5647750" containerName="cinder-db-sync" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.053938 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="488b1daa-7288-46cf-b351-2d6cec22c917" containerName="dnsmasq-dns" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.053967 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="969137a9-7e00-4472-8582-8008c5647750" containerName="cinder-db-sync" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.055051 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.060187 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.060479 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.060549 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.060560 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6fpbz" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.077571 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.121146 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-2rr95"] Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.122847 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.146194 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-config-data\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.146262 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-899nx\" (UniqueName: \"kubernetes.io/projected/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-kube-api-access-899nx\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.146300 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-scripts\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.146325 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.146349 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " 
pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.146375 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.161559 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-2rr95"] Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.245068 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.246409 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.248488 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.249377 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-2rr95\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.249404 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-scripts\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.249431 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.249460 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.249493 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.249520 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-2rr95\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.249552 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-config\") pod \"dnsmasq-dns-6d97fcdd8f-2rr95\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.249574 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-2rr95\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.249605 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-config-data\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.249666 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-899nx\" (UniqueName: \"kubernetes.io/projected/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-kube-api-access-899nx\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.249686 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbrdt\" (UniqueName: \"kubernetes.io/projected/697e704a-98d7-40a5-b685-d46635ebe33d-kube-api-access-jbrdt\") pod \"dnsmasq-dns-6d97fcdd8f-2rr95\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.249919 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.250032 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.255746 4964 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-scripts\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.257960 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.261460 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-config-data\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.269379 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.277581 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-899nx\" (UniqueName: \"kubernetes.io/projected/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-kube-api-access-899nx\") pod \"cinder-scheduler-0\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.352691 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2v6b\" (UniqueName: \"kubernetes.io/projected/9b83b7b2-baec-4044-919f-53f040d51d86-kube-api-access-w2v6b\") 
pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.352790 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-scripts\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.352809 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-config-data-custom\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.352862 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbrdt\" (UniqueName: \"kubernetes.io/projected/697e704a-98d7-40a5-b685-d46635ebe33d-kube-api-access-jbrdt\") pod \"dnsmasq-dns-6d97fcdd8f-2rr95\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.353003 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-2rr95\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.353114 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b83b7b2-baec-4044-919f-53f040d51d86-logs\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc 
kubenswrapper[4964]: I1004 02:57:24.353201 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-2rr95\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.353242 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.353269 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b83b7b2-baec-4044-919f-53f040d51d86-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.353308 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-config\") pod \"dnsmasq-dns-6d97fcdd8f-2rr95\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.353338 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-config-data\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.353361 4964 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-2rr95\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.354329 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-2rr95\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.354432 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-2rr95\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.354605 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-config\") pod \"dnsmasq-dns-6d97fcdd8f-2rr95\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.355027 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-2rr95\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.374230 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbrdt\" (UniqueName: 
\"kubernetes.io/projected/697e704a-98d7-40a5-b685-d46635ebe33d-kube-api-access-jbrdt\") pod \"dnsmasq-dns-6d97fcdd8f-2rr95\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.393568 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.450859 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.454192 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-scripts\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.454223 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-config-data-custom\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.454299 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b83b7b2-baec-4044-919f-53f040d51d86-logs\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.454335 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc 
kubenswrapper[4964]: I1004 02:57:24.454353 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b83b7b2-baec-4044-919f-53f040d51d86-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.454375 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-config-data\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.454409 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2v6b\" (UniqueName: \"kubernetes.io/projected/9b83b7b2-baec-4044-919f-53f040d51d86-kube-api-access-w2v6b\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.454710 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b83b7b2-baec-4044-919f-53f040d51d86-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.456957 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b83b7b2-baec-4044-919f-53f040d51d86-logs\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.457023 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-scripts\") pod \"cinder-api-0\" (UID: 
\"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.458514 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.463051 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-config-data\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.463293 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-config-data-custom\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.471030 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2v6b\" (UniqueName: \"kubernetes.io/projected/9b83b7b2-baec-4044-919f-53f040d51d86-kube-api-access-w2v6b\") pod \"cinder-api-0\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.648931 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.745143 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-2rr95"] Oct 04 02:57:24 crc kubenswrapper[4964]: W1004 02:57:24.763506 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod697e704a_98d7_40a5_b685_d46635ebe33d.slice/crio-a9c4e4af205c3af8ba3bd41b558c1cee566634756fd910b5b3f718eabf547325 WatchSource:0}: Error finding container a9c4e4af205c3af8ba3bd41b558c1cee566634756fd910b5b3f718eabf547325: Status 404 returned error can't find the container with id a9c4e4af205c3af8ba3bd41b558c1cee566634756fd910b5b3f718eabf547325 Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.804002 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.880396 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="488b1daa-7288-46cf-b351-2d6cec22c917" path="/var/lib/kubelet/pods/488b1daa-7288-46cf-b351-2d6cec22c917/volumes" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.889681 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ca73069-dcdb-4be1-bb50-3455eed66412" path="/var/lib/kubelet/pods/5ca73069-dcdb-4be1-bb50-3455eed66412/volumes" Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.890038 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3","Type":"ContainerStarted","Data":"80b0a4895b91afd52b8a717ef2cc2511d4b44e2bec461881742c0604937df677"} Oct 04 02:57:24 crc kubenswrapper[4964]: I1004 02:57:24.890059 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" 
event={"ID":"697e704a-98d7-40a5-b685-d46635ebe33d","Type":"ContainerStarted","Data":"a9c4e4af205c3af8ba3bd41b558c1cee566634756fd910b5b3f718eabf547325"} Oct 04 02:57:25 crc kubenswrapper[4964]: I1004 02:57:25.223236 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 04 02:57:25 crc kubenswrapper[4964]: W1004 02:57:25.246869 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b83b7b2_baec_4044_919f_53f040d51d86.slice/crio-234a9789c9a64e4254a79dba0992b4c63542a9ad8fad4df4f724cf748db3cf28 WatchSource:0}: Error finding container 234a9789c9a64e4254a79dba0992b4c63542a9ad8fad4df4f724cf748db3cf28: Status 404 returned error can't find the container with id 234a9789c9a64e4254a79dba0992b4c63542a9ad8fad4df4f724cf748db3cf28 Oct 04 02:57:25 crc kubenswrapper[4964]: I1004 02:57:25.896036 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9b83b7b2-baec-4044-919f-53f040d51d86","Type":"ContainerStarted","Data":"234a9789c9a64e4254a79dba0992b4c63542a9ad8fad4df4f724cf748db3cf28"} Oct 04 02:57:25 crc kubenswrapper[4964]: I1004 02:57:25.901640 4964 generic.go:334] "Generic (PLEG): container finished" podID="697e704a-98d7-40a5-b685-d46635ebe33d" containerID="e07ffb6d953f869c63d66b38bc1b8dd4160d67531d04cab9e12a04cadc9012ea" exitCode=0 Oct 04 02:57:25 crc kubenswrapper[4964]: I1004 02:57:25.901691 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" event={"ID":"697e704a-98d7-40a5-b685-d46635ebe33d","Type":"ContainerDied","Data":"e07ffb6d953f869c63d66b38bc1b8dd4160d67531d04cab9e12a04cadc9012ea"} Oct 04 02:57:26 crc kubenswrapper[4964]: I1004 02:57:26.251509 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 04 02:57:26 crc kubenswrapper[4964]: I1004 02:57:26.931065 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"9b83b7b2-baec-4044-919f-53f040d51d86","Type":"ContainerStarted","Data":"b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248"} Oct 04 02:57:26 crc kubenswrapper[4964]: I1004 02:57:26.936244 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3","Type":"ContainerStarted","Data":"17f26b5f3585a9f8e60d48a8e9abbfdb3c67917c3b42841e8251bed80d7c5380"} Oct 04 02:57:26 crc kubenswrapper[4964]: I1004 02:57:26.944520 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" event={"ID":"697e704a-98d7-40a5-b685-d46635ebe33d","Type":"ContainerStarted","Data":"cf49bbb7d96dac475f7539f5096c440a0502a63733f0e17d73401e5cc66c5ee3"} Oct 04 02:57:26 crc kubenswrapper[4964]: I1004 02:57:26.944746 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:26 crc kubenswrapper[4964]: I1004 02:57:26.974485 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" podStartSLOduration=2.974464984 podStartE2EDuration="2.974464984s" podCreationTimestamp="2025-10-04 02:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:57:26.966153594 +0000 UTC m=+1026.863112232" watchObservedRunningTime="2025-10-04 02:57:26.974464984 +0000 UTC m=+1026.871423622" Oct 04 02:57:27 crc kubenswrapper[4964]: I1004 02:57:27.958147 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9b83b7b2-baec-4044-919f-53f040d51d86","Type":"ContainerStarted","Data":"98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9"} Oct 04 02:57:27 crc kubenswrapper[4964]: I1004 02:57:27.958637 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/cinder-api-0" Oct 04 02:57:27 crc kubenswrapper[4964]: I1004 02:57:27.958291 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9b83b7b2-baec-4044-919f-53f040d51d86" containerName="cinder-api" containerID="cri-o://98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9" gracePeriod=30 Oct 04 02:57:27 crc kubenswrapper[4964]: I1004 02:57:27.958248 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9b83b7b2-baec-4044-919f-53f040d51d86" containerName="cinder-api-log" containerID="cri-o://b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248" gracePeriod=30 Oct 04 02:57:27 crc kubenswrapper[4964]: I1004 02:57:27.968679 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3","Type":"ContainerStarted","Data":"e79530df5d637bd704985cf9dc5575877412ba4095642b70520b7dfe200658e5"} Oct 04 02:57:27 crc kubenswrapper[4964]: I1004 02:57:27.984422 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.9843984040000002 podStartE2EDuration="3.984398404s" podCreationTimestamp="2025-10-04 02:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:57:27.981266588 +0000 UTC m=+1027.878225226" watchObservedRunningTime="2025-10-04 02:57:27.984398404 +0000 UTC m=+1027.881357042" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.015816 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.118559069 podStartE2EDuration="4.015792844s" podCreationTimestamp="2025-10-04 02:57:24 +0000 UTC" firstStartedPulling="2025-10-04 02:57:24.853653755 +0000 UTC m=+1024.750612393" lastFinishedPulling="2025-10-04 
02:57:25.75088753 +0000 UTC m=+1025.647846168" observedRunningTime="2025-10-04 02:57:28.005636948 +0000 UTC m=+1027.902595606" watchObservedRunningTime="2025-10-04 02:57:28.015792844 +0000 UTC m=+1027.912751482" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.582328 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.636651 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b83b7b2-baec-4044-919f-53f040d51d86-logs\") pod \"9b83b7b2-baec-4044-919f-53f040d51d86\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.636750 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-config-data-custom\") pod \"9b83b7b2-baec-4044-919f-53f040d51d86\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.636813 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-config-data\") pod \"9b83b7b2-baec-4044-919f-53f040d51d86\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.636838 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2v6b\" (UniqueName: \"kubernetes.io/projected/9b83b7b2-baec-4044-919f-53f040d51d86-kube-api-access-w2v6b\") pod \"9b83b7b2-baec-4044-919f-53f040d51d86\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.636868 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-scripts\") pod \"9b83b7b2-baec-4044-919f-53f040d51d86\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.636914 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b83b7b2-baec-4044-919f-53f040d51d86-etc-machine-id\") pod \"9b83b7b2-baec-4044-919f-53f040d51d86\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.636955 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-combined-ca-bundle\") pod \"9b83b7b2-baec-4044-919f-53f040d51d86\" (UID: \"9b83b7b2-baec-4044-919f-53f040d51d86\") " Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.637177 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b83b7b2-baec-4044-919f-53f040d51d86-logs" (OuterVolumeSpecName: "logs") pod "9b83b7b2-baec-4044-919f-53f040d51d86" (UID: "9b83b7b2-baec-4044-919f-53f040d51d86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.637462 4964 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b83b7b2-baec-4044-919f-53f040d51d86-logs\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.637543 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b83b7b2-baec-4044-919f-53f040d51d86-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9b83b7b2-baec-4044-919f-53f040d51d86" (UID: "9b83b7b2-baec-4044-919f-53f040d51d86"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.641897 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b83b7b2-baec-4044-919f-53f040d51d86-kube-api-access-w2v6b" (OuterVolumeSpecName: "kube-api-access-w2v6b") pod "9b83b7b2-baec-4044-919f-53f040d51d86" (UID: "9b83b7b2-baec-4044-919f-53f040d51d86"). InnerVolumeSpecName "kube-api-access-w2v6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.643019 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-scripts" (OuterVolumeSpecName: "scripts") pod "9b83b7b2-baec-4044-919f-53f040d51d86" (UID: "9b83b7b2-baec-4044-919f-53f040d51d86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.648434 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9b83b7b2-baec-4044-919f-53f040d51d86" (UID: "9b83b7b2-baec-4044-919f-53f040d51d86"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.674956 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b83b7b2-baec-4044-919f-53f040d51d86" (UID: "9b83b7b2-baec-4044-919f-53f040d51d86"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.716887 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-config-data" (OuterVolumeSpecName: "config-data") pod "9b83b7b2-baec-4044-919f-53f040d51d86" (UID: "9b83b7b2-baec-4044-919f-53f040d51d86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.739522 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.739549 4964 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b83b7b2-baec-4044-919f-53f040d51d86-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.739561 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.739570 4964 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.739578 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b83b7b2-baec-4044-919f-53f040d51d86-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.739587 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2v6b\" (UniqueName: 
\"kubernetes.io/projected/9b83b7b2-baec-4044-919f-53f040d51d86-kube-api-access-w2v6b\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.978192 4964 generic.go:334] "Generic (PLEG): container finished" podID="9b83b7b2-baec-4044-919f-53f040d51d86" containerID="98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9" exitCode=0 Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.978228 4964 generic.go:334] "Generic (PLEG): container finished" podID="9b83b7b2-baec-4044-919f-53f040d51d86" containerID="b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248" exitCode=143 Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.978310 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.978426 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9b83b7b2-baec-4044-919f-53f040d51d86","Type":"ContainerDied","Data":"98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9"} Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.978472 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9b83b7b2-baec-4044-919f-53f040d51d86","Type":"ContainerDied","Data":"b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248"} Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.978495 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9b83b7b2-baec-4044-919f-53f040d51d86","Type":"ContainerDied","Data":"234a9789c9a64e4254a79dba0992b4c63542a9ad8fad4df4f724cf748db3cf28"} Oct 04 02:57:28 crc kubenswrapper[4964]: I1004 02:57:28.978523 4964 scope.go:117] "RemoveContainer" containerID="98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.004497 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-api-0"] Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.017181 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.024510 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 04 02:57:29 crc kubenswrapper[4964]: E1004 02:57:29.024853 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b83b7b2-baec-4044-919f-53f040d51d86" containerName="cinder-api" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.024868 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b83b7b2-baec-4044-919f-53f040d51d86" containerName="cinder-api" Oct 04 02:57:29 crc kubenswrapper[4964]: E1004 02:57:29.024888 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b83b7b2-baec-4044-919f-53f040d51d86" containerName="cinder-api-log" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.024894 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b83b7b2-baec-4044-919f-53f040d51d86" containerName="cinder-api-log" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.025062 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b83b7b2-baec-4044-919f-53f040d51d86" containerName="cinder-api" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.025089 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b83b7b2-baec-4044-919f-53f040d51d86" containerName="cinder-api-log" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.025944 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.028758 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.029171 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.029318 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.033124 4964 scope.go:117] "RemoveContainer" containerID="b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.039114 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.059395 4964 scope.go:117] "RemoveContainer" containerID="98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9" Oct 04 02:57:29 crc kubenswrapper[4964]: E1004 02:57:29.060208 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9\": container with ID starting with 98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9 not found: ID does not exist" containerID="98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.060252 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9"} err="failed to get container status \"98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9\": rpc error: code = NotFound desc = could not find container \"98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9\": 
container with ID starting with 98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9 not found: ID does not exist" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.060277 4964 scope.go:117] "RemoveContainer" containerID="b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248" Oct 04 02:57:29 crc kubenswrapper[4964]: E1004 02:57:29.063559 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248\": container with ID starting with b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248 not found: ID does not exist" containerID="b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.063601 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248"} err="failed to get container status \"b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248\": rpc error: code = NotFound desc = could not find container \"b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248\": container with ID starting with b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248 not found: ID does not exist" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.063644 4964 scope.go:117] "RemoveContainer" containerID="98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.063966 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9"} err="failed to get container status \"98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9\": rpc error: code = NotFound desc = could not find container 
\"98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9\": container with ID starting with 98c6f3fd5d96cc8a9c7befb500f8071b0e8447cf3d95ac91f28cc8348b86e9a9 not found: ID does not exist" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.063991 4964 scope.go:117] "RemoveContainer" containerID="b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.064652 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248"} err="failed to get container status \"b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248\": rpc error: code = NotFound desc = could not find container \"b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248\": container with ID starting with b1e5af519cfc9cea58a4655910ea6f1dce10fc94fb503e62bac84a9bae668248 not found: ID does not exist" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.147567 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-config-data-custom\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.147713 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-logs\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.147867 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-public-tls-certs\") pod 
\"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.147941 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-scripts\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.148055 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.148084 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.148130 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-config-data\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.148198 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4pk7\" (UniqueName: \"kubernetes.io/projected/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-kube-api-access-k4pk7\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc 
kubenswrapper[4964]: I1004 02:57:29.148216 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.250146 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4pk7\" (UniqueName: \"kubernetes.io/projected/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-kube-api-access-k4pk7\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.250481 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.250534 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-config-data-custom\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.250590 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-logs\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.250591 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.250674 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.250714 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-scripts\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.250772 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.250797 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.250835 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-config-data\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 
02:57:29.251520 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-logs\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.256339 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-scripts\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.256565 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-config-data\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.257230 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-config-data-custom\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.266967 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.267905 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " 
pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.271121 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.271761 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4pk7\" (UniqueName: \"kubernetes.io/projected/1a05e5d9-10e1-44ab-88bf-c5e04a6af16c-kube-api-access-k4pk7\") pod \"cinder-api-0\" (UID: \"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c\") " pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.359325 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.394443 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 04 02:57:29 crc kubenswrapper[4964]: I1004 02:57:29.809910 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 04 02:57:30 crc kubenswrapper[4964]: I1004 02:57:30.862220 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b83b7b2-baec-4044-919f-53f040d51d86" path="/var/lib/kubelet/pods/9b83b7b2-baec-4044-919f-53f040d51d86/volumes" Oct 04 02:57:32 crc kubenswrapper[4964]: I1004 02:57:32.552366 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:32 crc kubenswrapper[4964]: I1004 02:57:32.552739 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="ceilometer-central-agent" containerID="cri-o://2282a37374a050ba26d66a9c8c8e914d542376999861483d89204f6aae45c683" gracePeriod=30 Oct 04 
02:57:32 crc kubenswrapper[4964]: I1004 02:57:32.552781 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="ceilometer-notification-agent" containerID="cri-o://5d5292f4e1d8bac14fa24400aba9cc89c69adb08f94439cf82818442aaff4d83" gracePeriod=30 Oct 04 02:57:32 crc kubenswrapper[4964]: I1004 02:57:32.552822 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="sg-core" containerID="cri-o://2a08642a447779d9925bb0a405ed1079a40252d0d4eb8b8ee7c4665242c66414" gracePeriod=30 Oct 04 02:57:32 crc kubenswrapper[4964]: I1004 02:57:32.552855 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="proxy-httpd" containerID="cri-o://b6be6db21b6bd65722c32979d73053fde62d0a41fec8e0a053874ba1628ce809" gracePeriod=30 Oct 04 02:57:32 crc kubenswrapper[4964]: I1004 02:57:32.561012 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 04 02:57:33 crc kubenswrapper[4964]: I1004 02:57:33.026013 4964 generic.go:334] "Generic (PLEG): container finished" podID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerID="b6be6db21b6bd65722c32979d73053fde62d0a41fec8e0a053874ba1628ce809" exitCode=0 Oct 04 02:57:33 crc kubenswrapper[4964]: I1004 02:57:33.026248 4964 generic.go:334] "Generic (PLEG): container finished" podID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerID="2a08642a447779d9925bb0a405ed1079a40252d0d4eb8b8ee7c4665242c66414" exitCode=2 Oct 04 02:57:33 crc kubenswrapper[4964]: I1004 02:57:33.026256 4964 generic.go:334] "Generic (PLEG): container finished" podID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerID="2282a37374a050ba26d66a9c8c8e914d542376999861483d89204f6aae45c683" exitCode=0 Oct 04 02:57:33 crc 
kubenswrapper[4964]: I1004 02:57:33.026056 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6fb06b3-f829-41c8-a37d-90f6d38ee79f","Type":"ContainerDied","Data":"b6be6db21b6bd65722c32979d73053fde62d0a41fec8e0a053874ba1628ce809"} Oct 04 02:57:33 crc kubenswrapper[4964]: I1004 02:57:33.026294 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6fb06b3-f829-41c8-a37d-90f6d38ee79f","Type":"ContainerDied","Data":"2a08642a447779d9925bb0a405ed1079a40252d0d4eb8b8ee7c4665242c66414"} Oct 04 02:57:33 crc kubenswrapper[4964]: I1004 02:57:33.026313 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6fb06b3-f829-41c8-a37d-90f6d38ee79f","Type":"ContainerDied","Data":"2282a37374a050ba26d66a9c8c8e914d542376999861483d89204f6aae45c683"} Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.302461 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-km2jg"] Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.307361 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-km2jg" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.321368 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-km2jg"] Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.407739 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-bxqjz"] Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.415033 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-bxqjz" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.429781 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-bxqjz"] Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.449300 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdzc9\" (UniqueName: \"kubernetes.io/projected/a64aceca-7ea4-4919-ba95-1c2a0349361b-kube-api-access-jdzc9\") pod \"nova-api-db-create-km2jg\" (UID: \"a64aceca-7ea4-4919-ba95-1c2a0349361b\") " pod="openstack/nova-api-db-create-km2jg" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.452492 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.501399 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-r2wfw"] Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.506718 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" podUID="ee10b681-dd61-4ff3-a697-31cf616048cb" containerName="dnsmasq-dns" containerID="cri-o://79450f558d10597308805369374df227f15cab75d6462dd1ed0276dccc35a794" gracePeriod=10 Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.550817 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w9hp\" (UniqueName: \"kubernetes.io/projected/59df1870-e2cf-41d0-9fc9-185801b5fd6f-kube-api-access-5w9hp\") pod \"nova-cell0-db-create-bxqjz\" (UID: \"59df1870-e2cf-41d0-9fc9-185801b5fd6f\") " pod="openstack/nova-cell0-db-create-bxqjz" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.550911 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdzc9\" (UniqueName: 
\"kubernetes.io/projected/a64aceca-7ea4-4919-ba95-1c2a0349361b-kube-api-access-jdzc9\") pod \"nova-api-db-create-km2jg\" (UID: \"a64aceca-7ea4-4919-ba95-1c2a0349361b\") " pod="openstack/nova-api-db-create-km2jg" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.574721 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdzc9\" (UniqueName: \"kubernetes.io/projected/a64aceca-7ea4-4919-ba95-1c2a0349361b-kube-api-access-jdzc9\") pod \"nova-api-db-create-km2jg\" (UID: \"a64aceca-7ea4-4919-ba95-1c2a0349361b\") " pod="openstack/nova-api-db-create-km2jg" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.596243 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-6sn6p"] Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.597228 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6sn6p" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.645150 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6sn6p"] Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.652466 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w9hp\" (UniqueName: \"kubernetes.io/projected/59df1870-e2cf-41d0-9fc9-185801b5fd6f-kube-api-access-5w9hp\") pod \"nova-cell0-db-create-bxqjz\" (UID: \"59df1870-e2cf-41d0-9fc9-185801b5fd6f\") " pod="openstack/nova-cell0-db-create-bxqjz" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.652579 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcqrr\" (UniqueName: \"kubernetes.io/projected/51b046b9-b373-4654-9f0f-ff28fc2d754c-kube-api-access-wcqrr\") pod \"nova-cell1-db-create-6sn6p\" (UID: \"51b046b9-b373-4654-9f0f-ff28fc2d754c\") " pod="openstack/nova-cell1-db-create-6sn6p" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.672891 4964 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-km2jg" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.675011 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w9hp\" (UniqueName: \"kubernetes.io/projected/59df1870-e2cf-41d0-9fc9-185801b5fd6f-kube-api-access-5w9hp\") pod \"nova-cell0-db-create-bxqjz\" (UID: \"59df1870-e2cf-41d0-9fc9-185801b5fd6f\") " pod="openstack/nova-cell0-db-create-bxqjz" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.734584 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bxqjz" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.759600 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcqrr\" (UniqueName: \"kubernetes.io/projected/51b046b9-b373-4654-9f0f-ff28fc2d754c-kube-api-access-wcqrr\") pod \"nova-cell1-db-create-6sn6p\" (UID: \"51b046b9-b373-4654-9f0f-ff28fc2d754c\") " pod="openstack/nova-cell1-db-create-6sn6p" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.775059 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcqrr\" (UniqueName: \"kubernetes.io/projected/51b046b9-b373-4654-9f0f-ff28fc2d754c-kube-api-access-wcqrr\") pod \"nova-cell1-db-create-6sn6p\" (UID: \"51b046b9-b373-4654-9f0f-ff28fc2d754c\") " pod="openstack/nova-cell1-db-create-6sn6p" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.798569 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.838779 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 02:57:34 crc kubenswrapper[4964]: I1004 02:57:34.970323 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-6sn6p" Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.043346 4964 generic.go:334] "Generic (PLEG): container finished" podID="ee10b681-dd61-4ff3-a697-31cf616048cb" containerID="79450f558d10597308805369374df227f15cab75d6462dd1ed0276dccc35a794" exitCode=0 Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.043527 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" containerName="cinder-scheduler" containerID="cri-o://17f26b5f3585a9f8e60d48a8e9abbfdb3c67917c3b42841e8251bed80d7c5380" gracePeriod=30 Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.043587 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" event={"ID":"ee10b681-dd61-4ff3-a697-31cf616048cb","Type":"ContainerDied","Data":"79450f558d10597308805369374df227f15cab75d6462dd1ed0276dccc35a794"} Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.043888 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" containerName="probe" containerID="cri-o://e79530df5d637bd704985cf9dc5575877412ba4095642b70520b7dfe200658e5" gracePeriod=30 Oct 04 02:57:35 crc kubenswrapper[4964]: W1004 02:57:35.248025 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a05e5d9_10e1_44ab_88bf_c5e04a6af16c.slice/crio-6c1b7642e189d286c775fd889eec21d5d3f771a4a415a401742b829005596c0a WatchSource:0}: Error finding container 6c1b7642e189d286c775fd889eec21d5d3f771a4a415a401742b829005596c0a: Status 404 returned error can't find the container with id 6c1b7642e189d286c775fd889eec21d5d3f771a4a415a401742b829005596c0a Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.729125 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.777705 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x29dn\" (UniqueName: \"kubernetes.io/projected/ee10b681-dd61-4ff3-a697-31cf616048cb-kube-api-access-x29dn\") pod \"ee10b681-dd61-4ff3-a697-31cf616048cb\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.777995 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-config\") pod \"ee10b681-dd61-4ff3-a697-31cf616048cb\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.778025 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-ovsdbserver-nb\") pod \"ee10b681-dd61-4ff3-a697-31cf616048cb\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.778132 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-dns-svc\") pod \"ee10b681-dd61-4ff3-a697-31cf616048cb\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.778208 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-ovsdbserver-sb\") pod \"ee10b681-dd61-4ff3-a697-31cf616048cb\" (UID: \"ee10b681-dd61-4ff3-a697-31cf616048cb\") " Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.783071 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ee10b681-dd61-4ff3-a697-31cf616048cb-kube-api-access-x29dn" (OuterVolumeSpecName: "kube-api-access-x29dn") pod "ee10b681-dd61-4ff3-a697-31cf616048cb" (UID: "ee10b681-dd61-4ff3-a697-31cf616048cb"). InnerVolumeSpecName "kube-api-access-x29dn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:35 crc kubenswrapper[4964]: W1004 02:57:35.823600 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59df1870_e2cf_41d0_9fc9_185801b5fd6f.slice/crio-f1b8bf0a5ad0693db61d6c51be60bb413cb99b12807ac9e7087a9fc928149959 WatchSource:0}: Error finding container f1b8bf0a5ad0693db61d6c51be60bb413cb99b12807ac9e7087a9fc928149959: Status 404 returned error can't find the container with id f1b8bf0a5ad0693db61d6c51be60bb413cb99b12807ac9e7087a9fc928149959 Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.824550 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-bxqjz"] Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.845220 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee10b681-dd61-4ff3-a697-31cf616048cb" (UID: "ee10b681-dd61-4ff3-a697-31cf616048cb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.852126 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee10b681-dd61-4ff3-a697-31cf616048cb" (UID: "ee10b681-dd61-4ff3-a697-31cf616048cb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.866012 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee10b681-dd61-4ff3-a697-31cf616048cb" (UID: "ee10b681-dd61-4ff3-a697-31cf616048cb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.866520 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-config" (OuterVolumeSpecName: "config") pod "ee10b681-dd61-4ff3-a697-31cf616048cb" (UID: "ee10b681-dd61-4ff3-a697-31cf616048cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.879689 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x29dn\" (UniqueName: \"kubernetes.io/projected/ee10b681-dd61-4ff3-a697-31cf616048cb-kube-api-access-x29dn\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.879720 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.879730 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.879739 4964 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:35 crc 
kubenswrapper[4964]: I1004 02:57:35.879747 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee10b681-dd61-4ff3-a697-31cf616048cb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.899041 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6sn6p"] Oct 04 02:57:35 crc kubenswrapper[4964]: I1004 02:57:35.914005 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-km2jg"] Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.058024 4964 generic.go:334] "Generic (PLEG): container finished" podID="1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" containerID="e79530df5d637bd704985cf9dc5575877412ba4095642b70520b7dfe200658e5" exitCode=0 Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.058107 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3","Type":"ContainerDied","Data":"e79530df5d637bd704985cf9dc5575877412ba4095642b70520b7dfe200658e5"} Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.059450 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-km2jg" event={"ID":"a64aceca-7ea4-4919-ba95-1c2a0349361b","Type":"ContainerStarted","Data":"5a916d8d3e86ba950bad83e64f84331efd514277091cf4a4c067a086b0dcd9cf"} Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.064387 4964 generic.go:334] "Generic (PLEG): container finished" podID="59df1870-e2cf-41d0-9fc9-185801b5fd6f" containerID="7e5fabbe5bc6bff557d221b7c43be2fca0c1ca264a4a84cacd7b9cc30424c81c" exitCode=0 Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.064442 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bxqjz" 
event={"ID":"59df1870-e2cf-41d0-9fc9-185801b5fd6f","Type":"ContainerDied","Data":"7e5fabbe5bc6bff557d221b7c43be2fca0c1ca264a4a84cacd7b9cc30424c81c"} Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.064493 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bxqjz" event={"ID":"59df1870-e2cf-41d0-9fc9-185801b5fd6f","Type":"ContainerStarted","Data":"f1b8bf0a5ad0693db61d6c51be60bb413cb99b12807ac9e7087a9fc928149959"} Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.066889 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.066889 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-r2wfw" event={"ID":"ee10b681-dd61-4ff3-a697-31cf616048cb","Type":"ContainerDied","Data":"8a6718b4b77aa343a82e6a5bc48914a581373977c711f6bd080bb5a2e1be7301"} Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.067041 4964 scope.go:117] "RemoveContainer" containerID="79450f558d10597308805369374df227f15cab75d6462dd1ed0276dccc35a794" Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.070869 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0e34537c-4b0f-4683-bf8a-3b56e44424b1","Type":"ContainerStarted","Data":"6adcaf40d7296a2c20b7d3208e2e5165a34aa043ce2877f21cc71e8077f1087f"} Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.072925 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6sn6p" event={"ID":"51b046b9-b373-4654-9f0f-ff28fc2d754c","Type":"ContainerStarted","Data":"a47b62e35c9d09c2df3700e4d356fade681ec412187cb398f54a04218a34c243"} Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.074583 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c","Type":"ContainerStarted","Data":"6c1b7642e189d286c775fd889eec21d5d3f771a4a415a401742b829005596c0a"} Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.098172 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.098371 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1b085f5a-7d46-4971-849a-3ff0f69cb179" containerName="kube-state-metrics" containerID="cri-o://405956f03d020a78ba4d1d8314bef91b0df179ab83883eca873527079516afbc" gracePeriod=30 Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.107122 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.131620372 podStartE2EDuration="14.107100789s" podCreationTimestamp="2025-10-04 02:57:22 +0000 UTC" firstStartedPulling="2025-10-04 02:57:23.383550162 +0000 UTC m=+1023.280508800" lastFinishedPulling="2025-10-04 02:57:35.359030579 +0000 UTC m=+1035.255989217" observedRunningTime="2025-10-04 02:57:36.099725861 +0000 UTC m=+1035.996684499" watchObservedRunningTime="2025-10-04 02:57:36.107100789 +0000 UTC m=+1036.004059437" Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.118824 4964 scope.go:117] "RemoveContainer" containerID="d09f6246519b0012634e9bd03ce0d8c51c14b444b8c2d0366344d6f6da90ddf5" Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.125306 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-r2wfw"] Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.131819 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-r2wfw"] Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.530311 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.596881 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75pht\" (UniqueName: \"kubernetes.io/projected/1b085f5a-7d46-4971-849a-3ff0f69cb179-kube-api-access-75pht\") pod \"1b085f5a-7d46-4971-849a-3ff0f69cb179\" (UID: \"1b085f5a-7d46-4971-849a-3ff0f69cb179\") " Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.600711 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b085f5a-7d46-4971-849a-3ff0f69cb179-kube-api-access-75pht" (OuterVolumeSpecName: "kube-api-access-75pht") pod "1b085f5a-7d46-4971-849a-3ff0f69cb179" (UID: "1b085f5a-7d46-4971-849a-3ff0f69cb179"). InnerVolumeSpecName "kube-api-access-75pht". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.699803 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75pht\" (UniqueName: \"kubernetes.io/projected/1b085f5a-7d46-4971-849a-3ff0f69cb179-kube-api-access-75pht\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.793676 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.863667 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee10b681-dd61-4ff3-a697-31cf616048cb" path="/var/lib/kubelet/pods/ee10b681-dd61-4ff3-a697-31cf616048cb/volumes" Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.904623 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-run-httpd\") pod \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.904695 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2fnx\" (UniqueName: \"kubernetes.io/projected/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-kube-api-access-b2fnx\") pod \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.904782 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-sg-core-conf-yaml\") pod \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.904834 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-log-httpd\") pod \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.904855 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-scripts\") pod 
\"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.905063 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c6fb06b3-f829-41c8-a37d-90f6d38ee79f" (UID: "c6fb06b3-f829-41c8-a37d-90f6d38ee79f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.905204 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-config-data\") pod \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.905287 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-combined-ca-bundle\") pod \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\" (UID: \"c6fb06b3-f829-41c8-a37d-90f6d38ee79f\") " Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.906478 4964 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.907663 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c6fb06b3-f829-41c8-a37d-90f6d38ee79f" (UID: "c6fb06b3-f829-41c8-a37d-90f6d38ee79f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.923132 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-kube-api-access-b2fnx" (OuterVolumeSpecName: "kube-api-access-b2fnx") pod "c6fb06b3-f829-41c8-a37d-90f6d38ee79f" (UID: "c6fb06b3-f829-41c8-a37d-90f6d38ee79f"). InnerVolumeSpecName "kube-api-access-b2fnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.928458 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c6fb06b3-f829-41c8-a37d-90f6d38ee79f" (UID: "c6fb06b3-f829-41c8-a37d-90f6d38ee79f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.939319 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-scripts" (OuterVolumeSpecName: "scripts") pod "c6fb06b3-f829-41c8-a37d-90f6d38ee79f" (UID: "c6fb06b3-f829-41c8-a37d-90f6d38ee79f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:36 crc kubenswrapper[4964]: I1004 02:57:36.987208 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6fb06b3-f829-41c8-a37d-90f6d38ee79f" (UID: "c6fb06b3-f829-41c8-a37d-90f6d38ee79f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.008652 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2fnx\" (UniqueName: \"kubernetes.io/projected/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-kube-api-access-b2fnx\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.008688 4964 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.008698 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.008708 4964 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.008718 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.019086 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-config-data" (OuterVolumeSpecName: "config-data") pod "c6fb06b3-f829-41c8-a37d-90f6d38ee79f" (UID: "c6fb06b3-f829-41c8-a37d-90f6d38ee79f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.082281 4964 generic.go:334] "Generic (PLEG): container finished" podID="a64aceca-7ea4-4919-ba95-1c2a0349361b" containerID="eee8449bc1dfc078ae6707261e07bbf8d7c0abdfb6122bce2f087847138ce321" exitCode=0 Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.082369 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-km2jg" event={"ID":"a64aceca-7ea4-4919-ba95-1c2a0349361b","Type":"ContainerDied","Data":"eee8449bc1dfc078ae6707261e07bbf8d7c0abdfb6122bce2f087847138ce321"} Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.085410 4964 generic.go:334] "Generic (PLEG): container finished" podID="51b046b9-b373-4654-9f0f-ff28fc2d754c" containerID="bba4ab83986c42295d794c2667eeac32b25ff3dae674743d0a9ea7ba18d2289f" exitCode=0 Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.085479 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6sn6p" event={"ID":"51b046b9-b373-4654-9f0f-ff28fc2d754c","Type":"ContainerDied","Data":"bba4ab83986c42295d794c2667eeac32b25ff3dae674743d0a9ea7ba18d2289f"} Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.088422 4964 generic.go:334] "Generic (PLEG): container finished" podID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerID="5d5292f4e1d8bac14fa24400aba9cc89c69adb08f94439cf82818442aaff4d83" exitCode=0 Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.088503 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6fb06b3-f829-41c8-a37d-90f6d38ee79f","Type":"ContainerDied","Data":"5d5292f4e1d8bac14fa24400aba9cc89c69adb08f94439cf82818442aaff4d83"} Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.088537 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c6fb06b3-f829-41c8-a37d-90f6d38ee79f","Type":"ContainerDied","Data":"3b3573a49e40e9e680c316ceec77a20bb15ff9af0a658183f3ec0b20985f4f93"} Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.088558 4964 scope.go:117] "RemoveContainer" containerID="b6be6db21b6bd65722c32979d73053fde62d0a41fec8e0a053874ba1628ce809" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.088763 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.090825 4964 generic.go:334] "Generic (PLEG): container finished" podID="1b085f5a-7d46-4971-849a-3ff0f69cb179" containerID="405956f03d020a78ba4d1d8314bef91b0df179ab83883eca873527079516afbc" exitCode=2 Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.090873 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1b085f5a-7d46-4971-849a-3ff0f69cb179","Type":"ContainerDied","Data":"405956f03d020a78ba4d1d8314bef91b0df179ab83883eca873527079516afbc"} Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.090896 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1b085f5a-7d46-4971-849a-3ff0f69cb179","Type":"ContainerDied","Data":"7fa0051ddcd32f37ca720d4880df1dcf603c77f40d6380a73cb4751d6ebdbc0e"} Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.090904 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.095473 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c","Type":"ContainerStarted","Data":"ec8e4637751b684396bc5e02f6bd2afdd75b92528dfca6f885c84f26afd5209d"} Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.095643 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a05e5d9-10e1-44ab-88bf-c5e04a6af16c","Type":"ContainerStarted","Data":"9ce7d3fb07451723dface3a86f3378cc41cfa83c15517f94136221a0ec0bcf2b"} Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.095796 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.110728 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6fb06b3-f829-41c8-a37d-90f6d38ee79f-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.119846 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.125430 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.142019 4964 scope.go:117] "RemoveContainer" containerID="2a08642a447779d9925bb0a405ed1079a40252d0d4eb8b8ee7c4665242c66414" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.153894 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 02:57:37 crc kubenswrapper[4964]: E1004 02:57:37.154268 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="proxy-httpd" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.154283 4964 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="proxy-httpd" Oct 04 02:57:37 crc kubenswrapper[4964]: E1004 02:57:37.154298 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b085f5a-7d46-4971-849a-3ff0f69cb179" containerName="kube-state-metrics" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.154305 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b085f5a-7d46-4971-849a-3ff0f69cb179" containerName="kube-state-metrics" Oct 04 02:57:37 crc kubenswrapper[4964]: E1004 02:57:37.154314 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="ceilometer-notification-agent" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.154321 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="ceilometer-notification-agent" Oct 04 02:57:37 crc kubenswrapper[4964]: E1004 02:57:37.154333 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="sg-core" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.154339 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="sg-core" Oct 04 02:57:37 crc kubenswrapper[4964]: E1004 02:57:37.154356 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee10b681-dd61-4ff3-a697-31cf616048cb" containerName="init" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.154362 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee10b681-dd61-4ff3-a697-31cf616048cb" containerName="init" Oct 04 02:57:37 crc kubenswrapper[4964]: E1004 02:57:37.154373 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee10b681-dd61-4ff3-a697-31cf616048cb" containerName="dnsmasq-dns" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.154379 4964 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ee10b681-dd61-4ff3-a697-31cf616048cb" containerName="dnsmasq-dns" Oct 04 02:57:37 crc kubenswrapper[4964]: E1004 02:57:37.154391 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="ceilometer-central-agent" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.154397 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="ceilometer-central-agent" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.154550 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="ceilometer-notification-agent" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.154558 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee10b681-dd61-4ff3-a697-31cf616048cb" containerName="dnsmasq-dns" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.154568 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="ceilometer-central-agent" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.154579 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b085f5a-7d46-4971-849a-3ff0f69cb179" containerName="kube-state-metrics" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.154591 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="sg-core" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.154607 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" containerName="proxy-httpd" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.155194 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.168264 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.168371 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.168461 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-4q6fg" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.169067 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.169135 4964 scope.go:117] "RemoveContainer" containerID="5d5292f4e1d8bac14fa24400aba9cc89c69adb08f94439cf82818442aaff4d83" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.170000 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=8.169981939 podStartE2EDuration="8.169981939s" podCreationTimestamp="2025-10-04 02:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:57:37.152406854 +0000 UTC m=+1037.049365492" watchObservedRunningTime="2025-10-04 02:57:37.169981939 +0000 UTC m=+1037.066940577" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.210136 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.212805 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/82986d74-7878-4a81-b004-447c44700cd9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: 
\"82986d74-7878-4a81-b004-447c44700cd9\") " pod="openstack/kube-state-metrics-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.212886 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-466th\" (UniqueName: \"kubernetes.io/projected/82986d74-7878-4a81-b004-447c44700cd9-kube-api-access-466th\") pod \"kube-state-metrics-0\" (UID: \"82986d74-7878-4a81-b004-447c44700cd9\") " pod="openstack/kube-state-metrics-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.212964 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/82986d74-7878-4a81-b004-447c44700cd9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"82986d74-7878-4a81-b004-447c44700cd9\") " pod="openstack/kube-state-metrics-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.212997 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82986d74-7878-4a81-b004-447c44700cd9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"82986d74-7878-4a81-b004-447c44700cd9\") " pod="openstack/kube-state-metrics-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.213767 4964 scope.go:117] "RemoveContainer" containerID="2282a37374a050ba26d66a9c8c8e914d542376999861483d89204f6aae45c683" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.222672 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.231428 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.233376 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.241610 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.241673 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.256196 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.291496 4964 scope.go:117] "RemoveContainer" containerID="b6be6db21b6bd65722c32979d73053fde62d0a41fec8e0a053874ba1628ce809" Oct 04 02:57:37 crc kubenswrapper[4964]: E1004 02:57:37.292566 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6be6db21b6bd65722c32979d73053fde62d0a41fec8e0a053874ba1628ce809\": container with ID starting with b6be6db21b6bd65722c32979d73053fde62d0a41fec8e0a053874ba1628ce809 not found: ID does not exist" containerID="b6be6db21b6bd65722c32979d73053fde62d0a41fec8e0a053874ba1628ce809" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.292596 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6be6db21b6bd65722c32979d73053fde62d0a41fec8e0a053874ba1628ce809"} err="failed to get container status \"b6be6db21b6bd65722c32979d73053fde62d0a41fec8e0a053874ba1628ce809\": rpc error: code = NotFound desc = could not find container \"b6be6db21b6bd65722c32979d73053fde62d0a41fec8e0a053874ba1628ce809\": container with ID starting with b6be6db21b6bd65722c32979d73053fde62d0a41fec8e0a053874ba1628ce809 not found: ID does not exist" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.292631 4964 scope.go:117] "RemoveContainer" containerID="2a08642a447779d9925bb0a405ed1079a40252d0d4eb8b8ee7c4665242c66414" Oct 04 02:57:37 crc kubenswrapper[4964]: E1004 
02:57:37.294727 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a08642a447779d9925bb0a405ed1079a40252d0d4eb8b8ee7c4665242c66414\": container with ID starting with 2a08642a447779d9925bb0a405ed1079a40252d0d4eb8b8ee7c4665242c66414 not found: ID does not exist" containerID="2a08642a447779d9925bb0a405ed1079a40252d0d4eb8b8ee7c4665242c66414" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.294751 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a08642a447779d9925bb0a405ed1079a40252d0d4eb8b8ee7c4665242c66414"} err="failed to get container status \"2a08642a447779d9925bb0a405ed1079a40252d0d4eb8b8ee7c4665242c66414\": rpc error: code = NotFound desc = could not find container \"2a08642a447779d9925bb0a405ed1079a40252d0d4eb8b8ee7c4665242c66414\": container with ID starting with 2a08642a447779d9925bb0a405ed1079a40252d0d4eb8b8ee7c4665242c66414 not found: ID does not exist" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.294765 4964 scope.go:117] "RemoveContainer" containerID="5d5292f4e1d8bac14fa24400aba9cc89c69adb08f94439cf82818442aaff4d83" Oct 04 02:57:37 crc kubenswrapper[4964]: E1004 02:57:37.297679 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d5292f4e1d8bac14fa24400aba9cc89c69adb08f94439cf82818442aaff4d83\": container with ID starting with 5d5292f4e1d8bac14fa24400aba9cc89c69adb08f94439cf82818442aaff4d83 not found: ID does not exist" containerID="5d5292f4e1d8bac14fa24400aba9cc89c69adb08f94439cf82818442aaff4d83" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.297700 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d5292f4e1d8bac14fa24400aba9cc89c69adb08f94439cf82818442aaff4d83"} err="failed to get container status \"5d5292f4e1d8bac14fa24400aba9cc89c69adb08f94439cf82818442aaff4d83\": rpc 
error: code = NotFound desc = could not find container \"5d5292f4e1d8bac14fa24400aba9cc89c69adb08f94439cf82818442aaff4d83\": container with ID starting with 5d5292f4e1d8bac14fa24400aba9cc89c69adb08f94439cf82818442aaff4d83 not found: ID does not exist" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.297714 4964 scope.go:117] "RemoveContainer" containerID="2282a37374a050ba26d66a9c8c8e914d542376999861483d89204f6aae45c683" Oct 04 02:57:37 crc kubenswrapper[4964]: E1004 02:57:37.299917 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2282a37374a050ba26d66a9c8c8e914d542376999861483d89204f6aae45c683\": container with ID starting with 2282a37374a050ba26d66a9c8c8e914d542376999861483d89204f6aae45c683 not found: ID does not exist" containerID="2282a37374a050ba26d66a9c8c8e914d542376999861483d89204f6aae45c683" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.299942 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2282a37374a050ba26d66a9c8c8e914d542376999861483d89204f6aae45c683"} err="failed to get container status \"2282a37374a050ba26d66a9c8c8e914d542376999861483d89204f6aae45c683\": rpc error: code = NotFound desc = could not find container \"2282a37374a050ba26d66a9c8c8e914d542376999861483d89204f6aae45c683\": container with ID starting with 2282a37374a050ba26d66a9c8c8e914d542376999861483d89204f6aae45c683 not found: ID does not exist" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.299956 4964 scope.go:117] "RemoveContainer" containerID="405956f03d020a78ba4d1d8314bef91b0df179ab83883eca873527079516afbc" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.314570 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61ef6680-199b-429e-b9a5-11c888d4274e-run-httpd\") pod \"ceilometer-0\" (UID: 
\"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.314659 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/82986d74-7878-4a81-b004-447c44700cd9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"82986d74-7878-4a81-b004-447c44700cd9\") " pod="openstack/kube-state-metrics-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.314689 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-scripts\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.314717 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-466th\" (UniqueName: \"kubernetes.io/projected/82986d74-7878-4a81-b004-447c44700cd9-kube-api-access-466th\") pod \"kube-state-metrics-0\" (UID: \"82986d74-7878-4a81-b004-447c44700cd9\") " pod="openstack/kube-state-metrics-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.314762 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/82986d74-7878-4a81-b004-447c44700cd9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"82986d74-7878-4a81-b004-447c44700cd9\") " pod="openstack/kube-state-metrics-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.314777 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61ef6680-199b-429e-b9a5-11c888d4274e-log-httpd\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 
crc kubenswrapper[4964]: I1004 02:57:37.314806 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82986d74-7878-4a81-b004-447c44700cd9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"82986d74-7878-4a81-b004-447c44700cd9\") " pod="openstack/kube-state-metrics-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.314830 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.314848 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns7x2\" (UniqueName: \"kubernetes.io/projected/61ef6680-199b-429e-b9a5-11c888d4274e-kube-api-access-ns7x2\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.314873 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-config-data\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.314900 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.317944 4964 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/82986d74-7878-4a81-b004-447c44700cd9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"82986d74-7878-4a81-b004-447c44700cd9\") " pod="openstack/kube-state-metrics-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.320223 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82986d74-7878-4a81-b004-447c44700cd9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"82986d74-7878-4a81-b004-447c44700cd9\") " pod="openstack/kube-state-metrics-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.321939 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/82986d74-7878-4a81-b004-447c44700cd9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"82986d74-7878-4a81-b004-447c44700cd9\") " pod="openstack/kube-state-metrics-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.326547 4964 scope.go:117] "RemoveContainer" containerID="405956f03d020a78ba4d1d8314bef91b0df179ab83883eca873527079516afbc" Oct 04 02:57:37 crc kubenswrapper[4964]: E1004 02:57:37.327310 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"405956f03d020a78ba4d1d8314bef91b0df179ab83883eca873527079516afbc\": container with ID starting with 405956f03d020a78ba4d1d8314bef91b0df179ab83883eca873527079516afbc not found: ID does not exist" containerID="405956f03d020a78ba4d1d8314bef91b0df179ab83883eca873527079516afbc" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.327330 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"405956f03d020a78ba4d1d8314bef91b0df179ab83883eca873527079516afbc"} err="failed to get container status 
\"405956f03d020a78ba4d1d8314bef91b0df179ab83883eca873527079516afbc\": rpc error: code = NotFound desc = could not find container \"405956f03d020a78ba4d1d8314bef91b0df179ab83883eca873527079516afbc\": container with ID starting with 405956f03d020a78ba4d1d8314bef91b0df179ab83883eca873527079516afbc not found: ID does not exist" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.331927 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-466th\" (UniqueName: \"kubernetes.io/projected/82986d74-7878-4a81-b004-447c44700cd9-kube-api-access-466th\") pod \"kube-state-metrics-0\" (UID: \"82986d74-7878-4a81-b004-447c44700cd9\") " pod="openstack/kube-state-metrics-0" Oct 04 02:57:37 crc kubenswrapper[4964]: E1004 02:57:37.365544 4964 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6fb06b3_f829_41c8_a37d_90f6d38ee79f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6fb06b3_f829_41c8_a37d_90f6d38ee79f.slice/crio-3b3573a49e40e9e680c316ceec77a20bb15ff9af0a658183f3ec0b20985f4f93\": RecentStats: unable to find data in memory cache]" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.418364 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-config-data\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.418643 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 
02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.418700 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61ef6680-199b-429e-b9a5-11c888d4274e-run-httpd\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.418731 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-scripts\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.418778 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61ef6680-199b-429e-b9a5-11c888d4274e-log-httpd\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.418802 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.418834 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns7x2\" (UniqueName: \"kubernetes.io/projected/61ef6680-199b-429e-b9a5-11c888d4274e-kube-api-access-ns7x2\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.420055 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61ef6680-199b-429e-b9a5-11c888d4274e-log-httpd\") 
pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.420597 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61ef6680-199b-429e-b9a5-11c888d4274e-run-httpd\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.423008 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.423084 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-scripts\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.423434 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-config-data\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.431053 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.438276 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns7x2\" (UniqueName: 
\"kubernetes.io/projected/61ef6680-199b-429e-b9a5-11c888d4274e-kube-api-access-ns7x2\") pod \"ceilometer-0\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.467702 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.468952 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.487275 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bxqjz" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.511481 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.622678 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w9hp\" (UniqueName: \"kubernetes.io/projected/59df1870-e2cf-41d0-9fc9-185801b5fd6f-kube-api-access-5w9hp\") pod \"59df1870-e2cf-41d0-9fc9-185801b5fd6f\" (UID: \"59df1870-e2cf-41d0-9fc9-185801b5fd6f\") " Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.629851 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59df1870-e2cf-41d0-9fc9-185801b5fd6f-kube-api-access-5w9hp" (OuterVolumeSpecName: "kube-api-access-5w9hp") pod "59df1870-e2cf-41d0-9fc9-185801b5fd6f" (UID: "59df1870-e2cf-41d0-9fc9-185801b5fd6f"). InnerVolumeSpecName "kube-api-access-5w9hp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:37 crc kubenswrapper[4964]: I1004 02:57:37.726487 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w9hp\" (UniqueName: \"kubernetes.io/projected/59df1870-e2cf-41d0-9fc9-185801b5fd6f-kube-api-access-5w9hp\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.011581 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:38 crc kubenswrapper[4964]: W1004 02:57:38.022598 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61ef6680_199b_429e_b9a5_11c888d4274e.slice/crio-aafa40179a873e27fca0ae93e61ec7e5475cf35af6217431f5e0cee226abb831 WatchSource:0}: Error finding container aafa40179a873e27fca0ae93e61ec7e5475cf35af6217431f5e0cee226abb831: Status 404 returned error can't find the container with id aafa40179a873e27fca0ae93e61ec7e5475cf35af6217431f5e0cee226abb831 Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.062875 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 02:57:38 crc kubenswrapper[4964]: W1004 02:57:38.075068 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82986d74_7878_4a81_b004_447c44700cd9.slice/crio-9b6cab470e4f900b07e64c85d7314ea996c09b5af7ba6a8b7aac08d0788d631b WatchSource:0}: Error finding container 9b6cab470e4f900b07e64c85d7314ea996c09b5af7ba6a8b7aac08d0788d631b: Status 404 returned error can't find the container with id 9b6cab470e4f900b07e64c85d7314ea996c09b5af7ba6a8b7aac08d0788d631b Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.107821 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"82986d74-7878-4a81-b004-447c44700cd9","Type":"ContainerStarted","Data":"9b6cab470e4f900b07e64c85d7314ea996c09b5af7ba6a8b7aac08d0788d631b"} Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.109009 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61ef6680-199b-429e-b9a5-11c888d4274e","Type":"ContainerStarted","Data":"aafa40179a873e27fca0ae93e61ec7e5475cf35af6217431f5e0cee226abb831"} Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.110980 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bxqjz" event={"ID":"59df1870-e2cf-41d0-9fc9-185801b5fd6f","Type":"ContainerDied","Data":"f1b8bf0a5ad0693db61d6c51be60bb413cb99b12807ac9e7087a9fc928149959"} Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.111009 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1b8bf0a5ad0693db61d6c51be60bb413cb99b12807ac9e7087a9fc928149959" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.111061 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-bxqjz" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.122136 4964 generic.go:334] "Generic (PLEG): container finished" podID="1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" containerID="17f26b5f3585a9f8e60d48a8e9abbfdb3c67917c3b42841e8251bed80d7c5380" exitCode=0 Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.122320 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3","Type":"ContainerDied","Data":"17f26b5f3585a9f8e60d48a8e9abbfdb3c67917c3b42841e8251bed80d7c5380"} Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.122353 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3","Type":"ContainerDied","Data":"80b0a4895b91afd52b8a717ef2cc2511d4b44e2bec461881742c0604937df677"} Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.122367 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80b0a4895b91afd52b8a717ef2cc2511d4b44e2bec461881742c0604937df677" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.163826 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.234683 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-config-data-custom\") pod \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.234760 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-899nx\" (UniqueName: \"kubernetes.io/projected/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-kube-api-access-899nx\") pod \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.234802 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-config-data\") pod \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.234859 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-combined-ca-bundle\") pod \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.234885 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-etc-machine-id\") pod \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.234956 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-scripts\") pod \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\" (UID: \"1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3\") " Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.237247 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" (UID: "1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.241036 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-scripts" (OuterVolumeSpecName: "scripts") pod "1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" (UID: "1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.241266 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-kube-api-access-899nx" (OuterVolumeSpecName: "kube-api-access-899nx") pod "1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" (UID: "1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3"). InnerVolumeSpecName "kube-api-access-899nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.241588 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" (UID: "1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.283015 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" (UID: "1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.339657 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.339682 4964 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.339691 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-899nx\" (UniqueName: \"kubernetes.io/projected/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-kube-api-access-899nx\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.339702 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.339712 4964 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.340903 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-config-data" (OuterVolumeSpecName: "config-data") pod "1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" (UID: "1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.348552 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6sn6p" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.440724 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcqrr\" (UniqueName: \"kubernetes.io/projected/51b046b9-b373-4654-9f0f-ff28fc2d754c-kube-api-access-wcqrr\") pod \"51b046b9-b373-4654-9f0f-ff28fc2d754c\" (UID: \"51b046b9-b373-4654-9f0f-ff28fc2d754c\") " Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.441057 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.443603 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51b046b9-b373-4654-9f0f-ff28fc2d754c-kube-api-access-wcqrr" (OuterVolumeSpecName: "kube-api-access-wcqrr") pod "51b046b9-b373-4654-9f0f-ff28fc2d754c" (UID: "51b046b9-b373-4654-9f0f-ff28fc2d754c"). InnerVolumeSpecName "kube-api-access-wcqrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.486381 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-km2jg" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.542584 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdzc9\" (UniqueName: \"kubernetes.io/projected/a64aceca-7ea4-4919-ba95-1c2a0349361b-kube-api-access-jdzc9\") pod \"a64aceca-7ea4-4919-ba95-1c2a0349361b\" (UID: \"a64aceca-7ea4-4919-ba95-1c2a0349361b\") " Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.542932 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcqrr\" (UniqueName: \"kubernetes.io/projected/51b046b9-b373-4654-9f0f-ff28fc2d754c-kube-api-access-wcqrr\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.547599 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64aceca-7ea4-4919-ba95-1c2a0349361b-kube-api-access-jdzc9" (OuterVolumeSpecName: "kube-api-access-jdzc9") pod "a64aceca-7ea4-4919-ba95-1c2a0349361b" (UID: "a64aceca-7ea4-4919-ba95-1c2a0349361b"). InnerVolumeSpecName "kube-api-access-jdzc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.644984 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdzc9\" (UniqueName: \"kubernetes.io/projected/a64aceca-7ea4-4919-ba95-1c2a0349361b-kube-api-access-jdzc9\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.858099 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b085f5a-7d46-4971-849a-3ff0f69cb179" path="/var/lib/kubelet/pods/1b085f5a-7d46-4971-849a-3ff0f69cb179/volumes" Oct 04 02:57:38 crc kubenswrapper[4964]: I1004 02:57:38.859599 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6fb06b3-f829-41c8-a37d-90f6d38ee79f" path="/var/lib/kubelet/pods/c6fb06b3-f829-41c8-a37d-90f6d38ee79f/volumes" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.132760 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"82986d74-7878-4a81-b004-447c44700cd9","Type":"ContainerStarted","Data":"c6ec0efa1c44f4a40023830323d48713c525e77b8992f35abb34fb5a8763aeb8"} Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.132843 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.134871 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-km2jg" event={"ID":"a64aceca-7ea4-4919-ba95-1c2a0349361b","Type":"ContainerDied","Data":"5a916d8d3e86ba950bad83e64f84331efd514277091cf4a4c067a086b0dcd9cf"} Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.134915 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a916d8d3e86ba950bad83e64f84331efd514277091cf4a4c067a086b0dcd9cf" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.134934 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-km2jg" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.136832 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61ef6680-199b-429e-b9a5-11c888d4274e","Type":"ContainerStarted","Data":"fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267"} Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.138651 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6sn6p" event={"ID":"51b046b9-b373-4654-9f0f-ff28fc2d754c","Type":"ContainerDied","Data":"a47b62e35c9d09c2df3700e4d356fade681ec412187cb398f54a04218a34c243"} Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.138676 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6sn6p" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.138688 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a47b62e35c9d09c2df3700e4d356fade681ec412187cb398f54a04218a34c243" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.138716 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.160342 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.809841655 podStartE2EDuration="2.160326726s" podCreationTimestamp="2025-10-04 02:57:37 +0000 UTC" firstStartedPulling="2025-10-04 02:57:38.078176131 +0000 UTC m=+1037.975134769" lastFinishedPulling="2025-10-04 02:57:38.428661212 +0000 UTC m=+1038.325619840" observedRunningTime="2025-10-04 02:57:39.156123094 +0000 UTC m=+1039.053081752" watchObservedRunningTime="2025-10-04 02:57:39.160326726 +0000 UTC m=+1039.057285364" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.181791 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.187604 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.205777 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 02:57:39 crc kubenswrapper[4964]: E1004 02:57:39.206125 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" containerName="cinder-scheduler" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.206143 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" containerName="cinder-scheduler" Oct 04 02:57:39 crc kubenswrapper[4964]: E1004 02:57:39.206155 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59df1870-e2cf-41d0-9fc9-185801b5fd6f" containerName="mariadb-database-create" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.206161 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="59df1870-e2cf-41d0-9fc9-185801b5fd6f" containerName="mariadb-database-create" Oct 04 02:57:39 crc kubenswrapper[4964]: E1004 
02:57:39.206181 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64aceca-7ea4-4919-ba95-1c2a0349361b" containerName="mariadb-database-create" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.206188 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64aceca-7ea4-4919-ba95-1c2a0349361b" containerName="mariadb-database-create" Oct 04 02:57:39 crc kubenswrapper[4964]: E1004 02:57:39.206209 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51b046b9-b373-4654-9f0f-ff28fc2d754c" containerName="mariadb-database-create" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.206215 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b046b9-b373-4654-9f0f-ff28fc2d754c" containerName="mariadb-database-create" Oct 04 02:57:39 crc kubenswrapper[4964]: E1004 02:57:39.206225 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" containerName="probe" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.206231 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" containerName="probe" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.206370 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64aceca-7ea4-4919-ba95-1c2a0349361b" containerName="mariadb-database-create" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.206384 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="51b046b9-b373-4654-9f0f-ff28fc2d754c" containerName="mariadb-database-create" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.206399 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" containerName="probe" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.206409 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="59df1870-e2cf-41d0-9fc9-185801b5fd6f" containerName="mariadb-database-create" Oct 04 02:57:39 
crc kubenswrapper[4964]: I1004 02:57:39.206418 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" containerName="cinder-scheduler" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.207276 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.210810 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.220934 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.259497 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8162366-d682-4f52-8402-4eff0411aae0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.259535 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8162366-d682-4f52-8402-4eff0411aae0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.259556 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8162366-d682-4f52-8402-4eff0411aae0-config-data\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.259586 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8162366-d682-4f52-8402-4eff0411aae0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.259745 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8162366-d682-4f52-8402-4eff0411aae0-scripts\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.259886 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdtvq\" (UniqueName: \"kubernetes.io/projected/c8162366-d682-4f52-8402-4eff0411aae0-kube-api-access-gdtvq\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.361698 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdtvq\" (UniqueName: \"kubernetes.io/projected/c8162366-d682-4f52-8402-4eff0411aae0-kube-api-access-gdtvq\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.361837 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8162366-d682-4f52-8402-4eff0411aae0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.361865 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c8162366-d682-4f52-8402-4eff0411aae0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.361890 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8162366-d682-4f52-8402-4eff0411aae0-config-data\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.361924 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8162366-d682-4f52-8402-4eff0411aae0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.361975 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8162366-d682-4f52-8402-4eff0411aae0-scripts\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.364737 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8162366-d682-4f52-8402-4eff0411aae0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.365855 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8162366-d682-4f52-8402-4eff0411aae0-scripts\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc 
kubenswrapper[4964]: I1004 02:57:39.366407 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8162366-d682-4f52-8402-4eff0411aae0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.367287 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8162366-d682-4f52-8402-4eff0411aae0-config-data\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.370936 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8162366-d682-4f52-8402-4eff0411aae0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.392891 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdtvq\" (UniqueName: \"kubernetes.io/projected/c8162366-d682-4f52-8402-4eff0411aae0-kube-api-access-gdtvq\") pod \"cinder-scheduler-0\" (UID: \"c8162366-d682-4f52-8402-4eff0411aae0\") " pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.529960 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 04 02:57:39 crc kubenswrapper[4964]: I1004 02:57:39.988106 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 02:57:40 crc kubenswrapper[4964]: I1004 02:57:40.150440 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61ef6680-199b-429e-b9a5-11c888d4274e","Type":"ContainerStarted","Data":"1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7"} Oct 04 02:57:40 crc kubenswrapper[4964]: I1004 02:57:40.153597 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c8162366-d682-4f52-8402-4eff0411aae0","Type":"ContainerStarted","Data":"8964739bd6c54aaa4542be1e3600422d958e1da77cbee8b2f17467f61c22a5c1"} Oct 04 02:57:40 crc kubenswrapper[4964]: I1004 02:57:40.863437 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3" path="/var/lib/kubelet/pods/1a1c8ce8-6c38-4d84-8c82-4de0ba8cd8f3/volumes" Oct 04 02:57:41 crc kubenswrapper[4964]: I1004 02:57:41.162183 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c8162366-d682-4f52-8402-4eff0411aae0","Type":"ContainerStarted","Data":"246ec9836f00b5cca6dbbf24fd22cd39c81179be03c875b4cfc3e8752b3b9ccf"} Oct 04 02:57:41 crc kubenswrapper[4964]: I1004 02:57:41.162226 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c8162366-d682-4f52-8402-4eff0411aae0","Type":"ContainerStarted","Data":"68abab0c5c57b3e61459110f9f8abe753c09599b3c7a0b2ac549916debf84225"} Oct 04 02:57:41 crc kubenswrapper[4964]: I1004 02:57:41.165026 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61ef6680-199b-429e-b9a5-11c888d4274e","Type":"ContainerStarted","Data":"22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728"} Oct 04 02:57:41 crc 
kubenswrapper[4964]: I1004 02:57:41.186070 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.186052347 podStartE2EDuration="2.186052347s" podCreationTimestamp="2025-10-04 02:57:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:57:41.182212525 +0000 UTC m=+1041.079171173" watchObservedRunningTime="2025-10-04 02:57:41.186052347 +0000 UTC m=+1041.083010985" Oct 04 02:57:42 crc kubenswrapper[4964]: I1004 02:57:42.193819 4964 generic.go:334] "Generic (PLEG): container finished" podID="61ef6680-199b-429e-b9a5-11c888d4274e" containerID="8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413" exitCode=1 Oct 04 02:57:42 crc kubenswrapper[4964]: I1004 02:57:42.194079 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61ef6680-199b-429e-b9a5-11c888d4274e","Type":"ContainerDied","Data":"8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413"} Oct 04 02:57:42 crc kubenswrapper[4964]: I1004 02:57:42.194293 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" containerName="ceilometer-central-agent" containerID="cri-o://fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267" gracePeriod=30 Oct 04 02:57:42 crc kubenswrapper[4964]: I1004 02:57:42.194441 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" containerName="sg-core" containerID="cri-o://22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728" gracePeriod=30 Oct 04 02:57:42 crc kubenswrapper[4964]: I1004 02:57:42.194506 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" 
containerName="ceilometer-notification-agent" containerID="cri-o://1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7" gracePeriod=30 Oct 04 02:57:42 crc kubenswrapper[4964]: I1004 02:57:42.437099 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.037891 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.127397 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61ef6680-199b-429e-b9a5-11c888d4274e-log-httpd\") pod \"61ef6680-199b-429e-b9a5-11c888d4274e\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.127775 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-config-data\") pod \"61ef6680-199b-429e-b9a5-11c888d4274e\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.127814 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61ef6680-199b-429e-b9a5-11c888d4274e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "61ef6680-199b-429e-b9a5-11c888d4274e" (UID: "61ef6680-199b-429e-b9a5-11c888d4274e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.127882 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-sg-core-conf-yaml\") pod \"61ef6680-199b-429e-b9a5-11c888d4274e\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.127988 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns7x2\" (UniqueName: \"kubernetes.io/projected/61ef6680-199b-429e-b9a5-11c888d4274e-kube-api-access-ns7x2\") pod \"61ef6680-199b-429e-b9a5-11c888d4274e\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.128063 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61ef6680-199b-429e-b9a5-11c888d4274e-run-httpd\") pod \"61ef6680-199b-429e-b9a5-11c888d4274e\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.128192 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-combined-ca-bundle\") pod \"61ef6680-199b-429e-b9a5-11c888d4274e\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.128239 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-scripts\") pod \"61ef6680-199b-429e-b9a5-11c888d4274e\" (UID: \"61ef6680-199b-429e-b9a5-11c888d4274e\") " Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.128465 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/61ef6680-199b-429e-b9a5-11c888d4274e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "61ef6680-199b-429e-b9a5-11c888d4274e" (UID: "61ef6680-199b-429e-b9a5-11c888d4274e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.128734 4964 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61ef6680-199b-429e-b9a5-11c888d4274e-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.128754 4964 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61ef6680-199b-429e-b9a5-11c888d4274e-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.132905 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-scripts" (OuterVolumeSpecName: "scripts") pod "61ef6680-199b-429e-b9a5-11c888d4274e" (UID: "61ef6680-199b-429e-b9a5-11c888d4274e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.133239 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61ef6680-199b-429e-b9a5-11c888d4274e-kube-api-access-ns7x2" (OuterVolumeSpecName: "kube-api-access-ns7x2") pod "61ef6680-199b-429e-b9a5-11c888d4274e" (UID: "61ef6680-199b-429e-b9a5-11c888d4274e"). InnerVolumeSpecName "kube-api-access-ns7x2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.158404 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "61ef6680-199b-429e-b9a5-11c888d4274e" (UID: "61ef6680-199b-429e-b9a5-11c888d4274e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.203793 4964 generic.go:334] "Generic (PLEG): container finished" podID="61ef6680-199b-429e-b9a5-11c888d4274e" containerID="22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728" exitCode=2 Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.203822 4964 generic.go:334] "Generic (PLEG): container finished" podID="61ef6680-199b-429e-b9a5-11c888d4274e" containerID="1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7" exitCode=0 Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.203830 4964 generic.go:334] "Generic (PLEG): container finished" podID="61ef6680-199b-429e-b9a5-11c888d4274e" containerID="fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267" exitCode=0 Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.203847 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61ef6680-199b-429e-b9a5-11c888d4274e","Type":"ContainerDied","Data":"22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728"} Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.203871 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61ef6680-199b-429e-b9a5-11c888d4274e","Type":"ContainerDied","Data":"1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7"} Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.203883 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"61ef6680-199b-429e-b9a5-11c888d4274e","Type":"ContainerDied","Data":"fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267"} Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.203891 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61ef6680-199b-429e-b9a5-11c888d4274e","Type":"ContainerDied","Data":"aafa40179a873e27fca0ae93e61ec7e5475cf35af6217431f5e0cee226abb831"} Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.203906 4964 scope.go:117] "RemoveContainer" containerID="8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.204019 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.208674 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-config-data" (OuterVolumeSpecName: "config-data") pod "61ef6680-199b-429e-b9a5-11c888d4274e" (UID: "61ef6680-199b-429e-b9a5-11c888d4274e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.219347 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61ef6680-199b-429e-b9a5-11c888d4274e" (UID: "61ef6680-199b-429e-b9a5-11c888d4274e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.230039 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns7x2\" (UniqueName: \"kubernetes.io/projected/61ef6680-199b-429e-b9a5-11c888d4274e-kube-api-access-ns7x2\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.230061 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.230070 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.230079 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.230089 4964 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61ef6680-199b-429e-b9a5-11c888d4274e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.240660 4964 scope.go:117] "RemoveContainer" containerID="22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.258130 4964 scope.go:117] "RemoveContainer" containerID="1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.278735 4964 scope.go:117] "RemoveContainer" containerID="fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.298606 4964 
scope.go:117] "RemoveContainer" containerID="8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413" Oct 04 02:57:43 crc kubenswrapper[4964]: E1004 02:57:43.298967 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413\": container with ID starting with 8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413 not found: ID does not exist" containerID="8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.298995 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413"} err="failed to get container status \"8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413\": rpc error: code = NotFound desc = could not find container \"8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413\": container with ID starting with 8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413 not found: ID does not exist" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.299015 4964 scope.go:117] "RemoveContainer" containerID="22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728" Oct 04 02:57:43 crc kubenswrapper[4964]: E1004 02:57:43.299267 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728\": container with ID starting with 22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728 not found: ID does not exist" containerID="22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.299286 4964 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728"} err="failed to get container status \"22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728\": rpc error: code = NotFound desc = could not find container \"22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728\": container with ID starting with 22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728 not found: ID does not exist" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.299300 4964 scope.go:117] "RemoveContainer" containerID="1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7" Oct 04 02:57:43 crc kubenswrapper[4964]: E1004 02:57:43.299733 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7\": container with ID starting with 1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7 not found: ID does not exist" containerID="1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.299803 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7"} err="failed to get container status \"1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7\": rpc error: code = NotFound desc = could not find container \"1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7\": container with ID starting with 1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7 not found: ID does not exist" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.299845 4964 scope.go:117] "RemoveContainer" containerID="fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267" Oct 04 02:57:43 crc kubenswrapper[4964]: E1004 02:57:43.300146 4964 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267\": container with ID starting with fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267 not found: ID does not exist" containerID="fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.300166 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267"} err="failed to get container status \"fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267\": rpc error: code = NotFound desc = could not find container \"fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267\": container with ID starting with fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267 not found: ID does not exist" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.300181 4964 scope.go:117] "RemoveContainer" containerID="8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.300551 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413"} err="failed to get container status \"8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413\": rpc error: code = NotFound desc = could not find container \"8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413\": container with ID starting with 8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413 not found: ID does not exist" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.300567 4964 scope.go:117] "RemoveContainer" containerID="22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.300909 4964 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728"} err="failed to get container status \"22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728\": rpc error: code = NotFound desc = could not find container \"22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728\": container with ID starting with 22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728 not found: ID does not exist" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.300950 4964 scope.go:117] "RemoveContainer" containerID="1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.301321 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7"} err="failed to get container status \"1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7\": rpc error: code = NotFound desc = could not find container \"1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7\": container with ID starting with 1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7 not found: ID does not exist" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.301360 4964 scope.go:117] "RemoveContainer" containerID="fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.319216 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267"} err="failed to get container status \"fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267\": rpc error: code = NotFound desc = could not find container \"fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267\": container with ID starting with 
fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267 not found: ID does not exist" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.319283 4964 scope.go:117] "RemoveContainer" containerID="8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.319702 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413"} err="failed to get container status \"8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413\": rpc error: code = NotFound desc = could not find container \"8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413\": container with ID starting with 8eb0ebc9a52376322da42ff50ad7f20078539a8c8e055590289021bb2cc82413 not found: ID does not exist" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.319743 4964 scope.go:117] "RemoveContainer" containerID="22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.320131 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728"} err="failed to get container status \"22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728\": rpc error: code = NotFound desc = could not find container \"22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728\": container with ID starting with 22fbe6b160fb647224ccb2d3d5e24791bcb86cdff0bedb9e0cd1b2e2afad2728 not found: ID does not exist" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.320172 4964 scope.go:117] "RemoveContainer" containerID="1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.320548 4964 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7"} err="failed to get container status \"1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7\": rpc error: code = NotFound desc = could not find container \"1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7\": container with ID starting with 1a85beaeb4fca3a4c72b9cc3a72a6ad405e9cd11d476141126b04b437cdd88b7 not found: ID does not exist" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.320686 4964 scope.go:117] "RemoveContainer" containerID="fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.321039 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267"} err="failed to get container status \"fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267\": rpc error: code = NotFound desc = could not find container \"fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267\": container with ID starting with fe2820fd03f88a2d1f9e4a709bd22fc6326c47ab6a97f7bcbd92531b71816267 not found: ID does not exist" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.534384 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.542949 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.578786 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:43 crc kubenswrapper[4964]: E1004 02:57:43.579280 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" containerName="ceilometer-notification-agent" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.579308 4964 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" containerName="ceilometer-notification-agent" Oct 04 02:57:43 crc kubenswrapper[4964]: E1004 02:57:43.579326 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" containerName="proxy-httpd" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.579339 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" containerName="proxy-httpd" Oct 04 02:57:43 crc kubenswrapper[4964]: E1004 02:57:43.579373 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" containerName="ceilometer-central-agent" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.579387 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" containerName="ceilometer-central-agent" Oct 04 02:57:43 crc kubenswrapper[4964]: E1004 02:57:43.579425 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" containerName="sg-core" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.579437 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" containerName="sg-core" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.579745 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" containerName="proxy-httpd" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.579776 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" containerName="ceilometer-notification-agent" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.579800 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" containerName="ceilometer-central-agent" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.579818 4964 
memory_manager.go:354] "RemoveStaleState removing state" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" containerName="sg-core" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.582679 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.585731 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.589571 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.590202 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.608557 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.635422 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-config-data\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.635685 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8cfa5de-9487-4a6e-8083-4cb498f21506-log-httpd\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.635806 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.635966 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.636119 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-scripts\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.636232 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8cfa5de-9487-4a6e-8083-4cb498f21506-run-httpd\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.636680 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.636845 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cp6b\" (UniqueName: \"kubernetes.io/projected/c8cfa5de-9487-4a6e-8083-4cb498f21506-kube-api-access-7cp6b\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 
02:57:43.738794 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.738855 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-scripts\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.738898 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8cfa5de-9487-4a6e-8083-4cb498f21506-run-httpd\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.738953 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.738994 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cp6b\" (UniqueName: \"kubernetes.io/projected/c8cfa5de-9487-4a6e-8083-4cb498f21506-kube-api-access-7cp6b\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.739035 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-config-data\") pod 
\"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.739060 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8cfa5de-9487-4a6e-8083-4cb498f21506-log-httpd\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.739093 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.739397 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8cfa5de-9487-4a6e-8083-4cb498f21506-run-httpd\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.739589 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8cfa5de-9487-4a6e-8083-4cb498f21506-log-httpd\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.742826 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.743230 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.743628 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.744358 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-scripts\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.746684 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-config-data\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.755095 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cp6b\" (UniqueName: \"kubernetes.io/projected/c8cfa5de-9487-4a6e-8083-4cb498f21506-kube-api-access-7cp6b\") pod \"ceilometer-0\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " pod="openstack/ceilometer-0" Oct 04 02:57:43 crc kubenswrapper[4964]: I1004 02:57:43.948543 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.258886 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-76d58c8cb5-9l2w8" Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.337015 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5cb49d4c48-qflxg"] Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.337217 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5cb49d4c48-qflxg" podUID="36580143-625b-4a5f-9ac1-890853622671" containerName="neutron-api" containerID="cri-o://02f4b96938b7e17d87e507cf1cfe5d538553f1daefe29915e94f98a1594344c5" gracePeriod=30 Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.337375 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5cb49d4c48-qflxg" podUID="36580143-625b-4a5f-9ac1-890853622671" containerName="neutron-httpd" containerID="cri-o://c85a475de344e0363af685cd6304da92c1d17331911d5e3e663227c0d00650cd" gracePeriod=30 Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.431586 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-eba9-account-create-sqtvk"] Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.432598 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-eba9-account-create-sqtvk" Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.441159 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.443273 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-eba9-account-create-sqtvk"] Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.454162 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnq9g\" (UniqueName: \"kubernetes.io/projected/5da7c2b0-bdcb-434a-a875-02f2d7143a32-kube-api-access-cnq9g\") pod \"nova-cell0-eba9-account-create-sqtvk\" (UID: \"5da7c2b0-bdcb-434a-a875-02f2d7143a32\") " pod="openstack/nova-cell0-eba9-account-create-sqtvk" Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.483856 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:44 crc kubenswrapper[4964]: W1004 02:57:44.488606 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8cfa5de_9487_4a6e_8083_4cb498f21506.slice/crio-a3db6dde437b839119d7adb0e6f900473b15309a3968ecca13ec96b0da84254d WatchSource:0}: Error finding container a3db6dde437b839119d7adb0e6f900473b15309a3968ecca13ec96b0da84254d: Status 404 returned error can't find the container with id a3db6dde437b839119d7adb0e6f900473b15309a3968ecca13ec96b0da84254d Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.531017 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.560830 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnq9g\" (UniqueName: \"kubernetes.io/projected/5da7c2b0-bdcb-434a-a875-02f2d7143a32-kube-api-access-cnq9g\") pod 
\"nova-cell0-eba9-account-create-sqtvk\" (UID: \"5da7c2b0-bdcb-434a-a875-02f2d7143a32\") " pod="openstack/nova-cell0-eba9-account-create-sqtvk" Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.584210 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnq9g\" (UniqueName: \"kubernetes.io/projected/5da7c2b0-bdcb-434a-a875-02f2d7143a32-kube-api-access-cnq9g\") pod \"nova-cell0-eba9-account-create-sqtvk\" (UID: \"5da7c2b0-bdcb-434a-a875-02f2d7143a32\") " pod="openstack/nova-cell0-eba9-account-create-sqtvk" Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.632560 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-b2c0-account-create-6zpbp"] Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.634946 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b2c0-account-create-6zpbp" Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.640937 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.643587 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b2c0-account-create-6zpbp"] Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.662867 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blhrn\" (UniqueName: \"kubernetes.io/projected/b7a1304b-a278-4813-9fa1-de46d59f5f87-kube-api-access-blhrn\") pod \"nova-cell1-b2c0-account-create-6zpbp\" (UID: \"b7a1304b-a278-4813-9fa1-de46d59f5f87\") " pod="openstack/nova-cell1-b2c0-account-create-6zpbp" Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.753079 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-eba9-account-create-sqtvk" Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.763779 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blhrn\" (UniqueName: \"kubernetes.io/projected/b7a1304b-a278-4813-9fa1-de46d59f5f87-kube-api-access-blhrn\") pod \"nova-cell1-b2c0-account-create-6zpbp\" (UID: \"b7a1304b-a278-4813-9fa1-de46d59f5f87\") " pod="openstack/nova-cell1-b2c0-account-create-6zpbp" Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.780379 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blhrn\" (UniqueName: \"kubernetes.io/projected/b7a1304b-a278-4813-9fa1-de46d59f5f87-kube-api-access-blhrn\") pod \"nova-cell1-b2c0-account-create-6zpbp\" (UID: \"b7a1304b-a278-4813-9fa1-de46d59f5f87\") " pod="openstack/nova-cell1-b2c0-account-create-6zpbp" Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.862281 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61ef6680-199b-429e-b9a5-11c888d4274e" path="/var/lib/kubelet/pods/61ef6680-199b-429e-b9a5-11c888d4274e/volumes" Oct 04 02:57:44 crc kubenswrapper[4964]: I1004 02:57:44.952779 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-b2c0-account-create-6zpbp" Oct 04 02:57:45 crc kubenswrapper[4964]: I1004 02:57:45.191572 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-eba9-account-create-sqtvk"] Oct 04 02:57:45 crc kubenswrapper[4964]: W1004 02:57:45.201228 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5da7c2b0_bdcb_434a_a875_02f2d7143a32.slice/crio-9acbc12b026976565776676b186eff16d37fc829dd7adefd1dd64770f92b59d3 WatchSource:0}: Error finding container 9acbc12b026976565776676b186eff16d37fc829dd7adefd1dd64770f92b59d3: Status 404 returned error can't find the container with id 9acbc12b026976565776676b186eff16d37fc829dd7adefd1dd64770f92b59d3 Oct 04 02:57:45 crc kubenswrapper[4964]: I1004 02:57:45.224227 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8cfa5de-9487-4a6e-8083-4cb498f21506","Type":"ContainerStarted","Data":"a3db6dde437b839119d7adb0e6f900473b15309a3968ecca13ec96b0da84254d"} Oct 04 02:57:45 crc kubenswrapper[4964]: I1004 02:57:45.225999 4964 generic.go:334] "Generic (PLEG): container finished" podID="36580143-625b-4a5f-9ac1-890853622671" containerID="c85a475de344e0363af685cd6304da92c1d17331911d5e3e663227c0d00650cd" exitCode=0 Oct 04 02:57:45 crc kubenswrapper[4964]: I1004 02:57:45.226046 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cb49d4c48-qflxg" event={"ID":"36580143-625b-4a5f-9ac1-890853622671","Type":"ContainerDied","Data":"c85a475de344e0363af685cd6304da92c1d17331911d5e3e663227c0d00650cd"} Oct 04 02:57:45 crc kubenswrapper[4964]: I1004 02:57:45.228594 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-eba9-account-create-sqtvk" event={"ID":"5da7c2b0-bdcb-434a-a875-02f2d7143a32","Type":"ContainerStarted","Data":"9acbc12b026976565776676b186eff16d37fc829dd7adefd1dd64770f92b59d3"} Oct 04 02:57:45 crc 
kubenswrapper[4964]: I1004 02:57:45.411508 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b2c0-account-create-6zpbp"] Oct 04 02:57:45 crc kubenswrapper[4964]: I1004 02:57:45.772822 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:46 crc kubenswrapper[4964]: I1004 02:57:46.240199 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8cfa5de-9487-4a6e-8083-4cb498f21506","Type":"ContainerStarted","Data":"ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5"} Oct 04 02:57:46 crc kubenswrapper[4964]: I1004 02:57:46.240509 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8cfa5de-9487-4a6e-8083-4cb498f21506","Type":"ContainerStarted","Data":"90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678"} Oct 04 02:57:46 crc kubenswrapper[4964]: I1004 02:57:46.242099 4964 generic.go:334] "Generic (PLEG): container finished" podID="5da7c2b0-bdcb-434a-a875-02f2d7143a32" containerID="ed8ec2815bf7580d8357c25af4facf618bc9d71e7a6d6e8afaf17a5506c78a2f" exitCode=0 Oct 04 02:57:46 crc kubenswrapper[4964]: I1004 02:57:46.242199 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-eba9-account-create-sqtvk" event={"ID":"5da7c2b0-bdcb-434a-a875-02f2d7143a32","Type":"ContainerDied","Data":"ed8ec2815bf7580d8357c25af4facf618bc9d71e7a6d6e8afaf17a5506c78a2f"} Oct 04 02:57:46 crc kubenswrapper[4964]: I1004 02:57:46.243776 4964 generic.go:334] "Generic (PLEG): container finished" podID="b7a1304b-a278-4813-9fa1-de46d59f5f87" containerID="0c6f5e4885fc7704a61a7e7c551198c3b95a97f3733bc4cf18c5d66621aacde9" exitCode=0 Oct 04 02:57:46 crc kubenswrapper[4964]: I1004 02:57:46.243807 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b2c0-account-create-6zpbp" 
event={"ID":"b7a1304b-a278-4813-9fa1-de46d59f5f87","Type":"ContainerDied","Data":"0c6f5e4885fc7704a61a7e7c551198c3b95a97f3733bc4cf18c5d66621aacde9"} Oct 04 02:57:46 crc kubenswrapper[4964]: I1004 02:57:46.243853 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b2c0-account-create-6zpbp" event={"ID":"b7a1304b-a278-4813-9fa1-de46d59f5f87","Type":"ContainerStarted","Data":"1c2086d2f2a2fcf07ef78fb382d383b3d02486077ef350df470e6e411c0e1439"} Oct 04 02:57:46 crc kubenswrapper[4964]: I1004 02:57:46.388409 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 04 02:57:47 crc kubenswrapper[4964]: I1004 02:57:47.254176 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8cfa5de-9487-4a6e-8083-4cb498f21506","Type":"ContainerStarted","Data":"1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573"} Oct 04 02:57:47 crc kubenswrapper[4964]: I1004 02:57:47.525894 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 04 02:57:48 crc kubenswrapper[4964]: I1004 02:57:48.560805 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b2c0-account-create-6zpbp" Oct 04 02:57:48 crc kubenswrapper[4964]: I1004 02:57:48.566190 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-eba9-account-create-sqtvk" Oct 04 02:57:48 crc kubenswrapper[4964]: I1004 02:57:48.653752 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blhrn\" (UniqueName: \"kubernetes.io/projected/b7a1304b-a278-4813-9fa1-de46d59f5f87-kube-api-access-blhrn\") pod \"b7a1304b-a278-4813-9fa1-de46d59f5f87\" (UID: \"b7a1304b-a278-4813-9fa1-de46d59f5f87\") " Oct 04 02:57:48 crc kubenswrapper[4964]: I1004 02:57:48.653974 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnq9g\" (UniqueName: \"kubernetes.io/projected/5da7c2b0-bdcb-434a-a875-02f2d7143a32-kube-api-access-cnq9g\") pod \"5da7c2b0-bdcb-434a-a875-02f2d7143a32\" (UID: \"5da7c2b0-bdcb-434a-a875-02f2d7143a32\") " Oct 04 02:57:48 crc kubenswrapper[4964]: I1004 02:57:48.676176 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da7c2b0-bdcb-434a-a875-02f2d7143a32-kube-api-access-cnq9g" (OuterVolumeSpecName: "kube-api-access-cnq9g") pod "5da7c2b0-bdcb-434a-a875-02f2d7143a32" (UID: "5da7c2b0-bdcb-434a-a875-02f2d7143a32"). InnerVolumeSpecName "kube-api-access-cnq9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:48 crc kubenswrapper[4964]: I1004 02:57:48.678821 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a1304b-a278-4813-9fa1-de46d59f5f87-kube-api-access-blhrn" (OuterVolumeSpecName: "kube-api-access-blhrn") pod "b7a1304b-a278-4813-9fa1-de46d59f5f87" (UID: "b7a1304b-a278-4813-9fa1-de46d59f5f87"). InnerVolumeSpecName "kube-api-access-blhrn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:48 crc kubenswrapper[4964]: I1004 02:57:48.756405 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnq9g\" (UniqueName: \"kubernetes.io/projected/5da7c2b0-bdcb-434a-a875-02f2d7143a32-kube-api-access-cnq9g\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:48 crc kubenswrapper[4964]: I1004 02:57:48.756671 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blhrn\" (UniqueName: \"kubernetes.io/projected/b7a1304b-a278-4813-9fa1-de46d59f5f87-kube-api-access-blhrn\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.256000 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.269423 4964 generic.go:334] "Generic (PLEG): container finished" podID="36580143-625b-4a5f-9ac1-890853622671" containerID="02f4b96938b7e17d87e507cf1cfe5d538553f1daefe29915e94f98a1594344c5" exitCode=0 Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.269499 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5cb49d4c48-qflxg" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.269504 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cb49d4c48-qflxg" event={"ID":"36580143-625b-4a5f-9ac1-890853622671","Type":"ContainerDied","Data":"02f4b96938b7e17d87e507cf1cfe5d538553f1daefe29915e94f98a1594344c5"} Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.269647 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cb49d4c48-qflxg" event={"ID":"36580143-625b-4a5f-9ac1-890853622671","Type":"ContainerDied","Data":"d013e7e2c0be437c4919a8d8b1d8389dd3bdf9404592f986ccaaebde9af57830"} Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.269675 4964 scope.go:117] "RemoveContainer" containerID="c85a475de344e0363af685cd6304da92c1d17331911d5e3e663227c0d00650cd" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.271177 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-eba9-account-create-sqtvk" event={"ID":"5da7c2b0-bdcb-434a-a875-02f2d7143a32","Type":"ContainerDied","Data":"9acbc12b026976565776676b186eff16d37fc829dd7adefd1dd64770f92b59d3"} Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.271209 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9acbc12b026976565776676b186eff16d37fc829dd7adefd1dd64770f92b59d3" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.271247 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-eba9-account-create-sqtvk" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.279887 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b2c0-account-create-6zpbp" event={"ID":"b7a1304b-a278-4813-9fa1-de46d59f5f87","Type":"ContainerDied","Data":"1c2086d2f2a2fcf07ef78fb382d383b3d02486077ef350df470e6e411c0e1439"} Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.279918 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c2086d2f2a2fcf07ef78fb382d383b3d02486077ef350df470e6e411c0e1439" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.279970 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b2c0-account-create-6zpbp" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.286270 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8cfa5de-9487-4a6e-8083-4cb498f21506","Type":"ContainerStarted","Data":"dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674"} Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.286402 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="ceilometer-central-agent" containerID="cri-o://90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678" gracePeriod=30 Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.286623 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.286824 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="proxy-httpd" containerID="cri-o://dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674" gracePeriod=30 Oct 04 02:57:49 crc 
kubenswrapper[4964]: I1004 02:57:49.286871 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="sg-core" containerID="cri-o://1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573" gracePeriod=30 Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.286903 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="ceilometer-notification-agent" containerID="cri-o://ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5" gracePeriod=30 Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.301432 4964 scope.go:117] "RemoveContainer" containerID="02f4b96938b7e17d87e507cf1cfe5d538553f1daefe29915e94f98a1594344c5" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.317575 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.101781117 podStartE2EDuration="6.31755333s" podCreationTimestamp="2025-10-04 02:57:43 +0000 UTC" firstStartedPulling="2025-10-04 02:57:44.490747554 +0000 UTC m=+1044.387706192" lastFinishedPulling="2025-10-04 02:57:48.706519767 +0000 UTC m=+1048.603478405" observedRunningTime="2025-10-04 02:57:49.317095187 +0000 UTC m=+1049.214053835" watchObservedRunningTime="2025-10-04 02:57:49.31755333 +0000 UTC m=+1049.214511978" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.336088 4964 scope.go:117] "RemoveContainer" containerID="c85a475de344e0363af685cd6304da92c1d17331911d5e3e663227c0d00650cd" Oct 04 02:57:49 crc kubenswrapper[4964]: E1004 02:57:49.343400 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c85a475de344e0363af685cd6304da92c1d17331911d5e3e663227c0d00650cd\": container with ID starting with 
c85a475de344e0363af685cd6304da92c1d17331911d5e3e663227c0d00650cd not found: ID does not exist" containerID="c85a475de344e0363af685cd6304da92c1d17331911d5e3e663227c0d00650cd" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.343442 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85a475de344e0363af685cd6304da92c1d17331911d5e3e663227c0d00650cd"} err="failed to get container status \"c85a475de344e0363af685cd6304da92c1d17331911d5e3e663227c0d00650cd\": rpc error: code = NotFound desc = could not find container \"c85a475de344e0363af685cd6304da92c1d17331911d5e3e663227c0d00650cd\": container with ID starting with c85a475de344e0363af685cd6304da92c1d17331911d5e3e663227c0d00650cd not found: ID does not exist" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.343467 4964 scope.go:117] "RemoveContainer" containerID="02f4b96938b7e17d87e507cf1cfe5d538553f1daefe29915e94f98a1594344c5" Oct 04 02:57:49 crc kubenswrapper[4964]: E1004 02:57:49.345695 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02f4b96938b7e17d87e507cf1cfe5d538553f1daefe29915e94f98a1594344c5\": container with ID starting with 02f4b96938b7e17d87e507cf1cfe5d538553f1daefe29915e94f98a1594344c5 not found: ID does not exist" containerID="02f4b96938b7e17d87e507cf1cfe5d538553f1daefe29915e94f98a1594344c5" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.345721 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02f4b96938b7e17d87e507cf1cfe5d538553f1daefe29915e94f98a1594344c5"} err="failed to get container status \"02f4b96938b7e17d87e507cf1cfe5d538553f1daefe29915e94f98a1594344c5\": rpc error: code = NotFound desc = could not find container \"02f4b96938b7e17d87e507cf1cfe5d538553f1daefe29915e94f98a1594344c5\": container with ID starting with 02f4b96938b7e17d87e507cf1cfe5d538553f1daefe29915e94f98a1594344c5 not found: ID does not 
exist" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.363090 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-config\") pod \"36580143-625b-4a5f-9ac1-890853622671\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.363266 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-combined-ca-bundle\") pod \"36580143-625b-4a5f-9ac1-890853622671\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.363294 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-httpd-config\") pod \"36580143-625b-4a5f-9ac1-890853622671\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.363782 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-ovndb-tls-certs\") pod \"36580143-625b-4a5f-9ac1-890853622671\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.363860 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpr2p\" (UniqueName: \"kubernetes.io/projected/36580143-625b-4a5f-9ac1-890853622671-kube-api-access-tpr2p\") pod \"36580143-625b-4a5f-9ac1-890853622671\" (UID: \"36580143-625b-4a5f-9ac1-890853622671\") " Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.367491 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36580143-625b-4a5f-9ac1-890853622671-kube-api-access-tpr2p" 
(OuterVolumeSpecName: "kube-api-access-tpr2p") pod "36580143-625b-4a5f-9ac1-890853622671" (UID: "36580143-625b-4a5f-9ac1-890853622671"). InnerVolumeSpecName "kube-api-access-tpr2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.368359 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "36580143-625b-4a5f-9ac1-890853622671" (UID: "36580143-625b-4a5f-9ac1-890853622671"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.417341 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-config" (OuterVolumeSpecName: "config") pod "36580143-625b-4a5f-9ac1-890853622671" (UID: "36580143-625b-4a5f-9ac1-890853622671"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.417776 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36580143-625b-4a5f-9ac1-890853622671" (UID: "36580143-625b-4a5f-9ac1-890853622671"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.438174 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "36580143-625b-4a5f-9ac1-890853622671" (UID: "36580143-625b-4a5f-9ac1-890853622671"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.466747 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.466809 4964 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.466829 4964 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.466848 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpr2p\" (UniqueName: \"kubernetes.io/projected/36580143-625b-4a5f-9ac1-890853622671-kube-api-access-tpr2p\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.466869 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/36580143-625b-4a5f-9ac1-890853622671-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.617378 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5cb49d4c48-qflxg"] Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.631274 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5cb49d4c48-qflxg"] Oct 04 02:57:49 crc kubenswrapper[4964]: I1004 02:57:49.730106 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.079982 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.179682 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-config-data\") pod \"c8cfa5de-9487-4a6e-8083-4cb498f21506\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.179770 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-combined-ca-bundle\") pod \"c8cfa5de-9487-4a6e-8083-4cb498f21506\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.179811 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cp6b\" (UniqueName: \"kubernetes.io/projected/c8cfa5de-9487-4a6e-8083-4cb498f21506-kube-api-access-7cp6b\") pod \"c8cfa5de-9487-4a6e-8083-4cb498f21506\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.179844 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8cfa5de-9487-4a6e-8083-4cb498f21506-log-httpd\") pod \"c8cfa5de-9487-4a6e-8083-4cb498f21506\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.179885 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8cfa5de-9487-4a6e-8083-4cb498f21506-run-httpd\") pod \"c8cfa5de-9487-4a6e-8083-4cb498f21506\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.179921 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-sg-core-conf-yaml\") pod \"c8cfa5de-9487-4a6e-8083-4cb498f21506\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.179989 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-scripts\") pod \"c8cfa5de-9487-4a6e-8083-4cb498f21506\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.180009 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-ceilometer-tls-certs\") pod \"c8cfa5de-9487-4a6e-8083-4cb498f21506\" (UID: \"c8cfa5de-9487-4a6e-8083-4cb498f21506\") " Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.180667 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8cfa5de-9487-4a6e-8083-4cb498f21506-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c8cfa5de-9487-4a6e-8083-4cb498f21506" (UID: "c8cfa5de-9487-4a6e-8083-4cb498f21506"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.180986 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8cfa5de-9487-4a6e-8083-4cb498f21506-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c8cfa5de-9487-4a6e-8083-4cb498f21506" (UID: "c8cfa5de-9487-4a6e-8083-4cb498f21506"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.181217 4964 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8cfa5de-9487-4a6e-8083-4cb498f21506-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.181230 4964 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8cfa5de-9487-4a6e-8083-4cb498f21506-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.186108 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8cfa5de-9487-4a6e-8083-4cb498f21506-kube-api-access-7cp6b" (OuterVolumeSpecName: "kube-api-access-7cp6b") pod "c8cfa5de-9487-4a6e-8083-4cb498f21506" (UID: "c8cfa5de-9487-4a6e-8083-4cb498f21506"). InnerVolumeSpecName "kube-api-access-7cp6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.188306 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-scripts" (OuterVolumeSpecName: "scripts") pod "c8cfa5de-9487-4a6e-8083-4cb498f21506" (UID: "c8cfa5de-9487-4a6e-8083-4cb498f21506"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.205807 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c8cfa5de-9487-4a6e-8083-4cb498f21506" (UID: "c8cfa5de-9487-4a6e-8083-4cb498f21506"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.233609 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c8cfa5de-9487-4a6e-8083-4cb498f21506" (UID: "c8cfa5de-9487-4a6e-8083-4cb498f21506"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.249719 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8cfa5de-9487-4a6e-8083-4cb498f21506" (UID: "c8cfa5de-9487-4a6e-8083-4cb498f21506"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.266557 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-config-data" (OuterVolumeSpecName: "config-data") pod "c8cfa5de-9487-4a6e-8083-4cb498f21506" (UID: "c8cfa5de-9487-4a6e-8083-4cb498f21506"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.283471 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.283532 4964 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.283553 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.283571 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.283587 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cp6b\" (UniqueName: \"kubernetes.io/projected/c8cfa5de-9487-4a6e-8083-4cb498f21506-kube-api-access-7cp6b\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.283603 4964 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8cfa5de-9487-4a6e-8083-4cb498f21506-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.295619 4964 generic.go:334] "Generic (PLEG): container finished" podID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerID="dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674" exitCode=0 Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.295662 4964 
generic.go:334] "Generic (PLEG): container finished" podID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerID="1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573" exitCode=2 Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.295670 4964 generic.go:334] "Generic (PLEG): container finished" podID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerID="ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5" exitCode=0 Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.295677 4964 generic.go:334] "Generic (PLEG): container finished" podID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerID="90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678" exitCode=0 Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.295703 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8cfa5de-9487-4a6e-8083-4cb498f21506","Type":"ContainerDied","Data":"dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674"} Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.295742 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.295775 4964 scope.go:117] "RemoveContainer" containerID="dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.295762 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8cfa5de-9487-4a6e-8083-4cb498f21506","Type":"ContainerDied","Data":"1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573"} Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.295920 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8cfa5de-9487-4a6e-8083-4cb498f21506","Type":"ContainerDied","Data":"ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5"} Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.295936 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8cfa5de-9487-4a6e-8083-4cb498f21506","Type":"ContainerDied","Data":"90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678"} Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.295947 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8cfa5de-9487-4a6e-8083-4cb498f21506","Type":"ContainerDied","Data":"a3db6dde437b839119d7adb0e6f900473b15309a3968ecca13ec96b0da84254d"} Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.333129 4964 scope.go:117] "RemoveContainer" containerID="1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.334142 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.346668 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.349808 4964 scope.go:117] "RemoveContainer" 
containerID="ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.358438 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:50 crc kubenswrapper[4964]: E1004 02:57:50.358759 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36580143-625b-4a5f-9ac1-890853622671" containerName="neutron-api" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.358778 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="36580143-625b-4a5f-9ac1-890853622671" containerName="neutron-api" Oct 04 02:57:50 crc kubenswrapper[4964]: E1004 02:57:50.358798 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da7c2b0-bdcb-434a-a875-02f2d7143a32" containerName="mariadb-account-create" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.358805 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da7c2b0-bdcb-434a-a875-02f2d7143a32" containerName="mariadb-account-create" Oct 04 02:57:50 crc kubenswrapper[4964]: E1004 02:57:50.358816 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="ceilometer-central-agent" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.358823 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="ceilometer-central-agent" Oct 04 02:57:50 crc kubenswrapper[4964]: E1004 02:57:50.358830 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="proxy-httpd" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.358835 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="proxy-httpd" Oct 04 02:57:50 crc kubenswrapper[4964]: E1004 02:57:50.358843 4964 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="36580143-625b-4a5f-9ac1-890853622671" containerName="neutron-httpd" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.358849 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="36580143-625b-4a5f-9ac1-890853622671" containerName="neutron-httpd" Oct 04 02:57:50 crc kubenswrapper[4964]: E1004 02:57:50.358866 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="ceilometer-notification-agent" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.358872 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="ceilometer-notification-agent" Oct 04 02:57:50 crc kubenswrapper[4964]: E1004 02:57:50.358882 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="sg-core" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.358888 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="sg-core" Oct 04 02:57:50 crc kubenswrapper[4964]: E1004 02:57:50.358899 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a1304b-a278-4813-9fa1-de46d59f5f87" containerName="mariadb-account-create" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.358905 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a1304b-a278-4813-9fa1-de46d59f5f87" containerName="mariadb-account-create" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.359047 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da7c2b0-bdcb-434a-a875-02f2d7143a32" containerName="mariadb-account-create" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.359060 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="36580143-625b-4a5f-9ac1-890853622671" containerName="neutron-api" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.359070 4964 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b7a1304b-a278-4813-9fa1-de46d59f5f87" containerName="mariadb-account-create" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.359079 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="36580143-625b-4a5f-9ac1-890853622671" containerName="neutron-httpd" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.359092 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="sg-core" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.359104 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="ceilometer-notification-agent" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.359114 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="proxy-httpd" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.359120 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" containerName="ceilometer-central-agent" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.360703 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.362674 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.363088 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.363093 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.371110 4964 scope.go:117] "RemoveContainer" containerID="90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.371132 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.407889 4964 scope.go:117] "RemoveContainer" containerID="dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674" Oct 04 02:57:50 crc kubenswrapper[4964]: E1004 02:57:50.408303 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674\": container with ID starting with dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674 not found: ID does not exist" containerID="dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.408330 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674"} err="failed to get container status \"dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674\": rpc error: code = NotFound desc = could not find container \"dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674\": 
container with ID starting with dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.408351 4964 scope.go:117] "RemoveContainer" containerID="1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573" Oct 04 02:57:50 crc kubenswrapper[4964]: E1004 02:57:50.408845 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573\": container with ID starting with 1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573 not found: ID does not exist" containerID="1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.408866 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573"} err="failed to get container status \"1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573\": rpc error: code = NotFound desc = could not find container \"1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573\": container with ID starting with 1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.408878 4964 scope.go:117] "RemoveContainer" containerID="ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5" Oct 04 02:57:50 crc kubenswrapper[4964]: E1004 02:57:50.409118 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5\": container with ID starting with ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5 not found: ID does not exist" 
containerID="ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.409137 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5"} err="failed to get container status \"ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5\": rpc error: code = NotFound desc = could not find container \"ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5\": container with ID starting with ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.409149 4964 scope.go:117] "RemoveContainer" containerID="90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678" Oct 04 02:57:50 crc kubenswrapper[4964]: E1004 02:57:50.409321 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678\": container with ID starting with 90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678 not found: ID does not exist" containerID="90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.409339 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678"} err="failed to get container status \"90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678\": rpc error: code = NotFound desc = could not find container \"90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678\": container with ID starting with 90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.409351 4964 scope.go:117] 
"RemoveContainer" containerID="dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.409519 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674"} err="failed to get container status \"dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674\": rpc error: code = NotFound desc = could not find container \"dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674\": container with ID starting with dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.409537 4964 scope.go:117] "RemoveContainer" containerID="1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.410953 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573"} err="failed to get container status \"1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573\": rpc error: code = NotFound desc = could not find container \"1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573\": container with ID starting with 1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.410973 4964 scope.go:117] "RemoveContainer" containerID="ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.411164 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5"} err="failed to get container status \"ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5\": rpc error: code = 
NotFound desc = could not find container \"ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5\": container with ID starting with ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.411181 4964 scope.go:117] "RemoveContainer" containerID="90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.411353 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678"} err="failed to get container status \"90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678\": rpc error: code = NotFound desc = could not find container \"90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678\": container with ID starting with 90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.411370 4964 scope.go:117] "RemoveContainer" containerID="dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.411605 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674"} err="failed to get container status \"dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674\": rpc error: code = NotFound desc = could not find container \"dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674\": container with ID starting with dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.411668 4964 scope.go:117] "RemoveContainer" containerID="1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573" Oct 04 02:57:50 crc 
kubenswrapper[4964]: I1004 02:57:50.413940 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573"} err="failed to get container status \"1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573\": rpc error: code = NotFound desc = could not find container \"1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573\": container with ID starting with 1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.413970 4964 scope.go:117] "RemoveContainer" containerID="ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.414164 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5"} err="failed to get container status \"ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5\": rpc error: code = NotFound desc = could not find container \"ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5\": container with ID starting with ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.414185 4964 scope.go:117] "RemoveContainer" containerID="90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.414360 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678"} err="failed to get container status \"90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678\": rpc error: code = NotFound desc = could not find container \"90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678\": container 
with ID starting with 90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.414377 4964 scope.go:117] "RemoveContainer" containerID="dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.414554 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674"} err="failed to get container status \"dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674\": rpc error: code = NotFound desc = could not find container \"dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674\": container with ID starting with dc36d01ab911c21f750caa4d8fb17d8c6d44de0b1b7e7e6e8e2c6b66d00ca674 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.414571 4964 scope.go:117] "RemoveContainer" containerID="1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.414754 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573"} err="failed to get container status \"1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573\": rpc error: code = NotFound desc = could not find container \"1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573\": container with ID starting with 1c0b6e74db0d9adc251f27167da1f293cb618e4c45e94699667b4b50b13ab573 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.414770 4964 scope.go:117] "RemoveContainer" containerID="ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.414941 4964 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5"} err="failed to get container status \"ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5\": rpc error: code = NotFound desc = could not find container \"ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5\": container with ID starting with ede83637ef631f08af22cbf272525b699d281de2e094511b4be54ca6b66154f5 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.414956 4964 scope.go:117] "RemoveContainer" containerID="90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.415116 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678"} err="failed to get container status \"90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678\": rpc error: code = NotFound desc = could not find container \"90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678\": container with ID starting with 90eda0227608e33ca92c0918607601b3521f8730bffa527dafca94f6f3010678 not found: ID does not exist" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.489652 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.489693 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-scripts\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc 
kubenswrapper[4964]: I1004 02:57:50.489717 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrxh2\" (UniqueName: \"kubernetes.io/projected/6762bccb-4bd0-4864-b4b0-c9e204b47682-kube-api-access-lrxh2\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.489752 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6762bccb-4bd0-4864-b4b0-c9e204b47682-run-httpd\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.489805 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.489837 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6762bccb-4bd0-4864-b4b0-c9e204b47682-log-httpd\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.489861 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-config-data\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.489875 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.590772 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6762bccb-4bd0-4864-b4b0-c9e204b47682-log-httpd\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.590817 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-config-data\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.590835 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.590876 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.590909 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-scripts\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 
04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.590937 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrxh2\" (UniqueName: \"kubernetes.io/projected/6762bccb-4bd0-4864-b4b0-c9e204b47682-kube-api-access-lrxh2\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.590970 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6762bccb-4bd0-4864-b4b0-c9e204b47682-run-httpd\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.591024 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.591289 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6762bccb-4bd0-4864-b4b0-c9e204b47682-log-httpd\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.591335 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6762bccb-4bd0-4864-b4b0-c9e204b47682-run-httpd\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.595212 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.595817 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.599215 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.604637 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-config-data\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.612038 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-scripts\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.619819 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrxh2\" (UniqueName: \"kubernetes.io/projected/6762bccb-4bd0-4864-b4b0-c9e204b47682-kube-api-access-lrxh2\") pod \"ceilometer-0\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.687787 4964 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.859519 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36580143-625b-4a5f-9ac1-890853622671" path="/var/lib/kubelet/pods/36580143-625b-4a5f-9ac1-890853622671/volumes" Oct 04 02:57:50 crc kubenswrapper[4964]: I1004 02:57:50.860131 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8cfa5de-9487-4a6e-8083-4cb498f21506" path="/var/lib/kubelet/pods/c8cfa5de-9487-4a6e-8083-4cb498f21506/volumes" Oct 04 02:57:51 crc kubenswrapper[4964]: I1004 02:57:51.169037 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:51 crc kubenswrapper[4964]: W1004 02:57:51.175899 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6762bccb_4bd0_4864_b4b0_c9e204b47682.slice/crio-95afe6d5cdde72544a9dffc091e7e4250e8fb832dd7e5dfabb6901904ffe6548 WatchSource:0}: Error finding container 95afe6d5cdde72544a9dffc091e7e4250e8fb832dd7e5dfabb6901904ffe6548: Status 404 returned error can't find the container with id 95afe6d5cdde72544a9dffc091e7e4250e8fb832dd7e5dfabb6901904ffe6548 Oct 04 02:57:51 crc kubenswrapper[4964]: I1004 02:57:51.306860 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6762bccb-4bd0-4864-b4b0-c9e204b47682","Type":"ContainerStarted","Data":"95afe6d5cdde72544a9dffc091e7e4250e8fb832dd7e5dfabb6901904ffe6548"} Oct 04 02:57:52 crc kubenswrapper[4964]: I1004 02:57:52.326946 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6762bccb-4bd0-4864-b4b0-c9e204b47682","Type":"ContainerStarted","Data":"12ec6c70b17f0932b549dffc59769daaa72f7c7ca6dba6e44c3d7c4d0f843775"} Oct 04 02:57:53 crc kubenswrapper[4964]: I1004 02:57:53.337413 4964 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"6762bccb-4bd0-4864-b4b0-c9e204b47682","Type":"ContainerStarted","Data":"196a3f7517f114786bc077ac31533ab68c2bcd96b556cea5b03b39e42f41b1f4"} Oct 04 02:57:53 crc kubenswrapper[4964]: I1004 02:57:53.337741 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6762bccb-4bd0-4864-b4b0-c9e204b47682","Type":"ContainerStarted","Data":"4b4646dd90fa332bd47c4e664352337638062b5a5db25c149adca24c6b436c07"} Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.453054 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-21c7-account-create-gpnbv"] Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.454294 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-21c7-account-create-gpnbv" Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.456544 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.470161 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-21c7-account-create-gpnbv"] Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.560515 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6d2t\" (UniqueName: \"kubernetes.io/projected/73278f79-5723-43d3-8038-0dc5479356cb-kube-api-access-l6d2t\") pod \"nova-api-21c7-account-create-gpnbv\" (UID: \"73278f79-5723-43d3-8038-0dc5479356cb\") " pod="openstack/nova-api-21c7-account-create-gpnbv" Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.662216 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6d2t\" (UniqueName: \"kubernetes.io/projected/73278f79-5723-43d3-8038-0dc5479356cb-kube-api-access-l6d2t\") pod \"nova-api-21c7-account-create-gpnbv\" (UID: \"73278f79-5723-43d3-8038-0dc5479356cb\") " 
pod="openstack/nova-api-21c7-account-create-gpnbv" Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.707774 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6d2t\" (UniqueName: \"kubernetes.io/projected/73278f79-5723-43d3-8038-0dc5479356cb-kube-api-access-l6d2t\") pod \"nova-api-21c7-account-create-gpnbv\" (UID: \"73278f79-5723-43d3-8038-0dc5479356cb\") " pod="openstack/nova-api-21c7-account-create-gpnbv" Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.797688 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-21c7-account-create-gpnbv" Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.902394 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5kmn5"] Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.903427 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.906923 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-d8w9q" Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.906927 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.907091 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.937667 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5kmn5"] Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.968683 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-scripts\") pod \"nova-cell0-conductor-db-sync-5kmn5\" 
(UID: \"19923833-9456-4f37-b45f-95d76e8b8483\") " pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.968734 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-config-data\") pod \"nova-cell0-conductor-db-sync-5kmn5\" (UID: \"19923833-9456-4f37-b45f-95d76e8b8483\") " pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.969773 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljhm5\" (UniqueName: \"kubernetes.io/projected/19923833-9456-4f37-b45f-95d76e8b8483-kube-api-access-ljhm5\") pod \"nova-cell0-conductor-db-sync-5kmn5\" (UID: \"19923833-9456-4f37-b45f-95d76e8b8483\") " pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:57:54 crc kubenswrapper[4964]: I1004 02:57:54.969876 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5kmn5\" (UID: \"19923833-9456-4f37-b45f-95d76e8b8483\") " pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:57:55 crc kubenswrapper[4964]: I1004 02:57:55.071636 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljhm5\" (UniqueName: \"kubernetes.io/projected/19923833-9456-4f37-b45f-95d76e8b8483-kube-api-access-ljhm5\") pod \"nova-cell0-conductor-db-sync-5kmn5\" (UID: \"19923833-9456-4f37-b45f-95d76e8b8483\") " pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:57:55 crc kubenswrapper[4964]: I1004 02:57:55.071912 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5kmn5\" (UID: \"19923833-9456-4f37-b45f-95d76e8b8483\") " pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:57:55 crc kubenswrapper[4964]: I1004 02:57:55.071980 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-scripts\") pod \"nova-cell0-conductor-db-sync-5kmn5\" (UID: \"19923833-9456-4f37-b45f-95d76e8b8483\") " pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:57:55 crc kubenswrapper[4964]: I1004 02:57:55.072003 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-config-data\") pod \"nova-cell0-conductor-db-sync-5kmn5\" (UID: \"19923833-9456-4f37-b45f-95d76e8b8483\") " pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:57:55 crc kubenswrapper[4964]: I1004 02:57:55.075669 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-config-data\") pod \"nova-cell0-conductor-db-sync-5kmn5\" (UID: \"19923833-9456-4f37-b45f-95d76e8b8483\") " pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:57:55 crc kubenswrapper[4964]: I1004 02:57:55.078112 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-scripts\") pod \"nova-cell0-conductor-db-sync-5kmn5\" (UID: \"19923833-9456-4f37-b45f-95d76e8b8483\") " pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:57:55 crc kubenswrapper[4964]: I1004 02:57:55.081082 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5kmn5\" (UID: \"19923833-9456-4f37-b45f-95d76e8b8483\") " pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:57:55 crc kubenswrapper[4964]: I1004 02:57:55.104395 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljhm5\" (UniqueName: \"kubernetes.io/projected/19923833-9456-4f37-b45f-95d76e8b8483-kube-api-access-ljhm5\") pod \"nova-cell0-conductor-db-sync-5kmn5\" (UID: \"19923833-9456-4f37-b45f-95d76e8b8483\") " pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:57:55 crc kubenswrapper[4964]: I1004 02:57:55.228876 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:57:55 crc kubenswrapper[4964]: I1004 02:57:55.381808 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6762bccb-4bd0-4864-b4b0-c9e204b47682","Type":"ContainerStarted","Data":"5350210f1dff8c84d9d6711328ea055ac07f24f688b6ab8c162be6c9e246c753"} Oct 04 02:57:55 crc kubenswrapper[4964]: I1004 02:57:55.382067 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 02:57:55 crc kubenswrapper[4964]: I1004 02:57:55.383575 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-21c7-account-create-gpnbv"] Oct 04 02:57:55 crc kubenswrapper[4964]: W1004 02:57:55.395307 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73278f79_5723_43d3_8038_0dc5479356cb.slice/crio-28d2ae0c66241d68288088a0317a90c3535fce62c955dafb49b8410deaea2916 WatchSource:0}: Error finding container 28d2ae0c66241d68288088a0317a90c3535fce62c955dafb49b8410deaea2916: Status 404 returned error can't find the container with id 28d2ae0c66241d68288088a0317a90c3535fce62c955dafb49b8410deaea2916 Oct 
04 02:57:55 crc kubenswrapper[4964]: I1004 02:57:55.428739 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.300053755 podStartE2EDuration="5.428721068s" podCreationTimestamp="2025-10-04 02:57:50 +0000 UTC" firstStartedPulling="2025-10-04 02:57:51.179319377 +0000 UTC m=+1051.076278015" lastFinishedPulling="2025-10-04 02:57:54.30798668 +0000 UTC m=+1054.204945328" observedRunningTime="2025-10-04 02:57:55.417402045 +0000 UTC m=+1055.314360693" watchObservedRunningTime="2025-10-04 02:57:55.428721068 +0000 UTC m=+1055.325679716" Oct 04 02:57:55 crc kubenswrapper[4964]: I1004 02:57:55.707846 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5kmn5"] Oct 04 02:57:55 crc kubenswrapper[4964]: W1004 02:57:55.715872 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19923833_9456_4f37_b45f_95d76e8b8483.slice/crio-1bcc4dbc3560b461fceaf4bed80874ba2093a468dcc5d1471afdb1d4a58cad3a WatchSource:0}: Error finding container 1bcc4dbc3560b461fceaf4bed80874ba2093a468dcc5d1471afdb1d4a58cad3a: Status 404 returned error can't find the container with id 1bcc4dbc3560b461fceaf4bed80874ba2093a468dcc5d1471afdb1d4a58cad3a Oct 04 02:57:56 crc kubenswrapper[4964]: I1004 02:57:56.393101 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5kmn5" event={"ID":"19923833-9456-4f37-b45f-95d76e8b8483","Type":"ContainerStarted","Data":"1bcc4dbc3560b461fceaf4bed80874ba2093a468dcc5d1471afdb1d4a58cad3a"} Oct 04 02:57:56 crc kubenswrapper[4964]: I1004 02:57:56.395643 4964 generic.go:334] "Generic (PLEG): container finished" podID="73278f79-5723-43d3-8038-0dc5479356cb" containerID="5c68b018a8b5c0240dfac8ba2cbee259480f24f37a90864ad11bf69a4e32ab8f" exitCode=0 Oct 04 02:57:56 crc kubenswrapper[4964]: I1004 02:57:56.396559 4964 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-21c7-account-create-gpnbv" event={"ID":"73278f79-5723-43d3-8038-0dc5479356cb","Type":"ContainerDied","Data":"5c68b018a8b5c0240dfac8ba2cbee259480f24f37a90864ad11bf69a4e32ab8f"} Oct 04 02:57:56 crc kubenswrapper[4964]: I1004 02:57:56.396591 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-21c7-account-create-gpnbv" event={"ID":"73278f79-5723-43d3-8038-0dc5479356cb","Type":"ContainerStarted","Data":"28d2ae0c66241d68288088a0317a90c3535fce62c955dafb49b8410deaea2916"} Oct 04 02:57:57 crc kubenswrapper[4964]: I1004 02:57:57.789070 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-21c7-account-create-gpnbv" Oct 04 02:57:57 crc kubenswrapper[4964]: I1004 02:57:57.938530 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6d2t\" (UniqueName: \"kubernetes.io/projected/73278f79-5723-43d3-8038-0dc5479356cb-kube-api-access-l6d2t\") pod \"73278f79-5723-43d3-8038-0dc5479356cb\" (UID: \"73278f79-5723-43d3-8038-0dc5479356cb\") " Oct 04 02:57:57 crc kubenswrapper[4964]: I1004 02:57:57.945813 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73278f79-5723-43d3-8038-0dc5479356cb-kube-api-access-l6d2t" (OuterVolumeSpecName: "kube-api-access-l6d2t") pod "73278f79-5723-43d3-8038-0dc5479356cb" (UID: "73278f79-5723-43d3-8038-0dc5479356cb"). InnerVolumeSpecName "kube-api-access-l6d2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:57:58 crc kubenswrapper[4964]: I1004 02:57:58.041378 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6d2t\" (UniqueName: \"kubernetes.io/projected/73278f79-5723-43d3-8038-0dc5479356cb-kube-api-access-l6d2t\") on node \"crc\" DevicePath \"\"" Oct 04 02:57:58 crc kubenswrapper[4964]: I1004 02:57:58.416244 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-21c7-account-create-gpnbv" event={"ID":"73278f79-5723-43d3-8038-0dc5479356cb","Type":"ContainerDied","Data":"28d2ae0c66241d68288088a0317a90c3535fce62c955dafb49b8410deaea2916"} Oct 04 02:57:58 crc kubenswrapper[4964]: I1004 02:57:58.416495 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28d2ae0c66241d68288088a0317a90c3535fce62c955dafb49b8410deaea2916" Oct 04 02:57:58 crc kubenswrapper[4964]: I1004 02:57:58.416542 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-21c7-account-create-gpnbv" Oct 04 02:57:59 crc kubenswrapper[4964]: I1004 02:57:59.860338 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:57:59 crc kubenswrapper[4964]: I1004 02:57:59.860638 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="ceilometer-central-agent" containerID="cri-o://12ec6c70b17f0932b549dffc59769daaa72f7c7ca6dba6e44c3d7c4d0f843775" gracePeriod=30 Oct 04 02:57:59 crc kubenswrapper[4964]: I1004 02:57:59.860686 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="sg-core" containerID="cri-o://196a3f7517f114786bc077ac31533ab68c2bcd96b556cea5b03b39e42f41b1f4" gracePeriod=30 Oct 04 02:57:59 crc kubenswrapper[4964]: I1004 02:57:59.860684 4964 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="proxy-httpd" containerID="cri-o://5350210f1dff8c84d9d6711328ea055ac07f24f688b6ab8c162be6c9e246c753" gracePeriod=30 Oct 04 02:57:59 crc kubenswrapper[4964]: I1004 02:57:59.860702 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="ceilometer-notification-agent" containerID="cri-o://4b4646dd90fa332bd47c4e664352337638062b5a5db25c149adca24c6b436c07" gracePeriod=30 Oct 04 02:58:00 crc kubenswrapper[4964]: I1004 02:58:00.432802 4964 generic.go:334] "Generic (PLEG): container finished" podID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerID="5350210f1dff8c84d9d6711328ea055ac07f24f688b6ab8c162be6c9e246c753" exitCode=0 Oct 04 02:58:00 crc kubenswrapper[4964]: I1004 02:58:00.433108 4964 generic.go:334] "Generic (PLEG): container finished" podID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerID="196a3f7517f114786bc077ac31533ab68c2bcd96b556cea5b03b39e42f41b1f4" exitCode=2 Oct 04 02:58:00 crc kubenswrapper[4964]: I1004 02:58:00.433117 4964 generic.go:334] "Generic (PLEG): container finished" podID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerID="4b4646dd90fa332bd47c4e664352337638062b5a5db25c149adca24c6b436c07" exitCode=0 Oct 04 02:58:00 crc kubenswrapper[4964]: I1004 02:58:00.433125 4964 generic.go:334] "Generic (PLEG): container finished" podID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerID="12ec6c70b17f0932b549dffc59769daaa72f7c7ca6dba6e44c3d7c4d0f843775" exitCode=0 Oct 04 02:58:00 crc kubenswrapper[4964]: I1004 02:58:00.432879 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6762bccb-4bd0-4864-b4b0-c9e204b47682","Type":"ContainerDied","Data":"5350210f1dff8c84d9d6711328ea055ac07f24f688b6ab8c162be6c9e246c753"} Oct 04 02:58:00 crc kubenswrapper[4964]: 
I1004 02:58:00.433158 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6762bccb-4bd0-4864-b4b0-c9e204b47682","Type":"ContainerDied","Data":"196a3f7517f114786bc077ac31533ab68c2bcd96b556cea5b03b39e42f41b1f4"} Oct 04 02:58:00 crc kubenswrapper[4964]: I1004 02:58:00.433171 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6762bccb-4bd0-4864-b4b0-c9e204b47682","Type":"ContainerDied","Data":"4b4646dd90fa332bd47c4e664352337638062b5a5db25c149adca24c6b436c07"} Oct 04 02:58:00 crc kubenswrapper[4964]: I1004 02:58:00.433213 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6762bccb-4bd0-4864-b4b0-c9e204b47682","Type":"ContainerDied","Data":"12ec6c70b17f0932b549dffc59769daaa72f7c7ca6dba6e44c3d7c4d0f843775"} Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.846766 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.949788 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrxh2\" (UniqueName: \"kubernetes.io/projected/6762bccb-4bd0-4864-b4b0-c9e204b47682-kube-api-access-lrxh2\") pod \"6762bccb-4bd0-4864-b4b0-c9e204b47682\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.950008 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-ceilometer-tls-certs\") pod \"6762bccb-4bd0-4864-b4b0-c9e204b47682\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.950125 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-scripts\") pod 
\"6762bccb-4bd0-4864-b4b0-c9e204b47682\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.950224 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6762bccb-4bd0-4864-b4b0-c9e204b47682-run-httpd\") pod \"6762bccb-4bd0-4864-b4b0-c9e204b47682\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.950308 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-sg-core-conf-yaml\") pod \"6762bccb-4bd0-4864-b4b0-c9e204b47682\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.950392 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-combined-ca-bundle\") pod \"6762bccb-4bd0-4864-b4b0-c9e204b47682\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.950585 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6762bccb-4bd0-4864-b4b0-c9e204b47682-log-httpd\") pod \"6762bccb-4bd0-4864-b4b0-c9e204b47682\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.950663 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6762bccb-4bd0-4864-b4b0-c9e204b47682-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6762bccb-4bd0-4864-b4b0-c9e204b47682" (UID: "6762bccb-4bd0-4864-b4b0-c9e204b47682"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.950680 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-config-data\") pod \"6762bccb-4bd0-4864-b4b0-c9e204b47682\" (UID: \"6762bccb-4bd0-4864-b4b0-c9e204b47682\") " Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.951053 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6762bccb-4bd0-4864-b4b0-c9e204b47682-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6762bccb-4bd0-4864-b4b0-c9e204b47682" (UID: "6762bccb-4bd0-4864-b4b0-c9e204b47682"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.951755 4964 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6762bccb-4bd0-4864-b4b0-c9e204b47682-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.951783 4964 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6762bccb-4bd0-4864-b4b0-c9e204b47682-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.953537 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6762bccb-4bd0-4864-b4b0-c9e204b47682-kube-api-access-lrxh2" (OuterVolumeSpecName: "kube-api-access-lrxh2") pod "6762bccb-4bd0-4864-b4b0-c9e204b47682" (UID: "6762bccb-4bd0-4864-b4b0-c9e204b47682"). InnerVolumeSpecName "kube-api-access-lrxh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.954171 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-scripts" (OuterVolumeSpecName: "scripts") pod "6762bccb-4bd0-4864-b4b0-c9e204b47682" (UID: "6762bccb-4bd0-4864-b4b0-c9e204b47682"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:02 crc kubenswrapper[4964]: I1004 02:58:02.971544 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6762bccb-4bd0-4864-b4b0-c9e204b47682" (UID: "6762bccb-4bd0-4864-b4b0-c9e204b47682"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.032739 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6762bccb-4bd0-4864-b4b0-c9e204b47682" (UID: "6762bccb-4bd0-4864-b4b0-c9e204b47682"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.038272 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6762bccb-4bd0-4864-b4b0-c9e204b47682" (UID: "6762bccb-4bd0-4864-b4b0-c9e204b47682"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.053261 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrxh2\" (UniqueName: \"kubernetes.io/projected/6762bccb-4bd0-4864-b4b0-c9e204b47682-kube-api-access-lrxh2\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.053293 4964 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.053334 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.053432 4964 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.053444 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.070970 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-config-data" (OuterVolumeSpecName: "config-data") pod "6762bccb-4bd0-4864-b4b0-c9e204b47682" (UID: "6762bccb-4bd0-4864-b4b0-c9e204b47682"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.154770 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6762bccb-4bd0-4864-b4b0-c9e204b47682-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.478371 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5kmn5" event={"ID":"19923833-9456-4f37-b45f-95d76e8b8483","Type":"ContainerStarted","Data":"58002698ab53561d18086c5eeb1c4a6e7348d28d57a523ce7015d8ed9cb13239"} Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.482694 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6762bccb-4bd0-4864-b4b0-c9e204b47682","Type":"ContainerDied","Data":"95afe6d5cdde72544a9dffc091e7e4250e8fb832dd7e5dfabb6901904ffe6548"} Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.483044 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.483721 4964 scope.go:117] "RemoveContainer" containerID="5350210f1dff8c84d9d6711328ea055ac07f24f688b6ab8c162be6c9e246c753" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.517741 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-5kmn5" podStartSLOduration=2.498021602 podStartE2EDuration="9.517704376s" podCreationTimestamp="2025-10-04 02:57:54 +0000 UTC" firstStartedPulling="2025-10-04 02:57:55.718299158 +0000 UTC m=+1055.615257806" lastFinishedPulling="2025-10-04 02:58:02.737981932 +0000 UTC m=+1062.634940580" observedRunningTime="2025-10-04 02:58:03.511698535 +0000 UTC m=+1063.408657213" watchObservedRunningTime="2025-10-04 02:58:03.517704376 +0000 UTC m=+1063.414663084" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.539287 4964 scope.go:117] "RemoveContainer" containerID="196a3f7517f114786bc077ac31533ab68c2bcd96b556cea5b03b39e42f41b1f4" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.548401 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.556006 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.567293 4964 scope.go:117] "RemoveContainer" containerID="4b4646dd90fa332bd47c4e664352337638062b5a5db25c149adca24c6b436c07" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.584812 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:58:03 crc kubenswrapper[4964]: E1004 02:58:03.585215 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="ceilometer-notification-agent" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.585259 4964 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="ceilometer-notification-agent" Oct 04 02:58:03 crc kubenswrapper[4964]: E1004 02:58:03.585272 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="proxy-httpd" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.585278 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="proxy-httpd" Oct 04 02:58:03 crc kubenswrapper[4964]: E1004 02:58:03.585298 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73278f79-5723-43d3-8038-0dc5479356cb" containerName="mariadb-account-create" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.585304 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="73278f79-5723-43d3-8038-0dc5479356cb" containerName="mariadb-account-create" Oct 04 02:58:03 crc kubenswrapper[4964]: E1004 02:58:03.585312 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="sg-core" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.585317 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="sg-core" Oct 04 02:58:03 crc kubenswrapper[4964]: E1004 02:58:03.585332 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="ceilometer-central-agent" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.585338 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="ceilometer-central-agent" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.585488 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="proxy-httpd" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.585504 4964 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="ceilometer-notification-agent" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.585519 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="73278f79-5723-43d3-8038-0dc5479356cb" containerName="mariadb-account-create" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.585529 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="ceilometer-central-agent" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.585587 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" containerName="sg-core" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.587063 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.591661 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.591662 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.591879 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.605236 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.610702 4964 scope.go:117] "RemoveContainer" containerID="12ec6c70b17f0932b549dffc59769daaa72f7c7ca6dba6e44c3d7c4d0f843775" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.669895 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-scripts\") pod 
\"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.669949 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.670028 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.670059 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb1307c-b75c-4636-a08b-b6be0eec41cd-run-httpd\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.670094 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-config-data\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.670168 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb1307c-b75c-4636-a08b-b6be0eec41cd-log-httpd\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: 
I1004 02:58:03.670197 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p556c\" (UniqueName: \"kubernetes.io/projected/2bb1307c-b75c-4636-a08b-b6be0eec41cd-kube-api-access-p556c\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.670277 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.771696 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-scripts\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.771754 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.771798 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.771816 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2bb1307c-b75c-4636-a08b-b6be0eec41cd-run-httpd\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.771833 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-config-data\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.771878 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb1307c-b75c-4636-a08b-b6be0eec41cd-log-httpd\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.771896 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p556c\" (UniqueName: \"kubernetes.io/projected/2bb1307c-b75c-4636-a08b-b6be0eec41cd-kube-api-access-p556c\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.771944 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.773098 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb1307c-b75c-4636-a08b-b6be0eec41cd-log-httpd\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 
02:58:03.773148 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb1307c-b75c-4636-a08b-b6be0eec41cd-run-httpd\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.778866 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-config-data\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.785251 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-scripts\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.785741 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.786217 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.786697 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " 
pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.788056 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p556c\" (UniqueName: \"kubernetes.io/projected/2bb1307c-b75c-4636-a08b-b6be0eec41cd-kube-api-access-p556c\") pod \"ceilometer-0\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " pod="openstack/ceilometer-0" Oct 04 02:58:03 crc kubenswrapper[4964]: I1004 02:58:03.907286 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:58:04 crc kubenswrapper[4964]: W1004 02:58:04.173196 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bb1307c_b75c_4636_a08b_b6be0eec41cd.slice/crio-6ba14fb5358d84eb29866e2f70948b88f51986cbbc631fc40e8f34eae44f666e WatchSource:0}: Error finding container 6ba14fb5358d84eb29866e2f70948b88f51986cbbc631fc40e8f34eae44f666e: Status 404 returned error can't find the container with id 6ba14fb5358d84eb29866e2f70948b88f51986cbbc631fc40e8f34eae44f666e Oct 04 02:58:04 crc kubenswrapper[4964]: I1004 02:58:04.174899 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:58:04 crc kubenswrapper[4964]: I1004 02:58:04.495259 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb1307c-b75c-4636-a08b-b6be0eec41cd","Type":"ContainerStarted","Data":"6ba14fb5358d84eb29866e2f70948b88f51986cbbc631fc40e8f34eae44f666e"} Oct 04 02:58:04 crc kubenswrapper[4964]: I1004 02:58:04.866766 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6762bccb-4bd0-4864-b4b0-c9e204b47682" path="/var/lib/kubelet/pods/6762bccb-4bd0-4864-b4b0-c9e204b47682/volumes" Oct 04 02:58:06 crc kubenswrapper[4964]: I1004 02:58:06.519602 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2bb1307c-b75c-4636-a08b-b6be0eec41cd","Type":"ContainerStarted","Data":"2b53a5c9caf49fdaa7d0c279a279de2852f91b876442a51f8efb1701195e3f23"} Oct 04 02:58:07 crc kubenswrapper[4964]: I1004 02:58:07.535564 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb1307c-b75c-4636-a08b-b6be0eec41cd","Type":"ContainerStarted","Data":"fe96a7a19086378fa74f9cfac1ecf10898cbe088f52ba45707fee2ebd921069c"} Oct 04 02:58:08 crc kubenswrapper[4964]: I1004 02:58:08.565811 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb1307c-b75c-4636-a08b-b6be0eec41cd","Type":"ContainerStarted","Data":"423ab058eab13a173490af5786bb7076721e6c3d05cde6474b2a812c54fd2424"} Oct 04 02:58:09 crc kubenswrapper[4964]: I1004 02:58:09.585475 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb1307c-b75c-4636-a08b-b6be0eec41cd","Type":"ContainerStarted","Data":"13cdba0ef215d4cde02983e0b02e0dda19fc6e4a88e302d182a150ba9ce87275"} Oct 04 02:58:09 crc kubenswrapper[4964]: I1004 02:58:09.585949 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 02:58:12 crc kubenswrapper[4964]: I1004 02:58:12.618570 4964 generic.go:334] "Generic (PLEG): container finished" podID="19923833-9456-4f37-b45f-95d76e8b8483" containerID="58002698ab53561d18086c5eeb1c4a6e7348d28d57a523ce7015d8ed9cb13239" exitCode=0 Oct 04 02:58:12 crc kubenswrapper[4964]: I1004 02:58:12.618683 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5kmn5" event={"ID":"19923833-9456-4f37-b45f-95d76e8b8483","Type":"ContainerDied","Data":"58002698ab53561d18086c5eeb1c4a6e7348d28d57a523ce7015d8ed9cb13239"} Oct 04 02:58:12 crc kubenswrapper[4964]: I1004 02:58:12.648767 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.186498328 
podStartE2EDuration="9.648742459s" podCreationTimestamp="2025-10-04 02:58:03 +0000 UTC" firstStartedPulling="2025-10-04 02:58:04.175607192 +0000 UTC m=+1064.072565860" lastFinishedPulling="2025-10-04 02:58:08.637851343 +0000 UTC m=+1068.534809991" observedRunningTime="2025-10-04 02:58:09.625973447 +0000 UTC m=+1069.522932115" watchObservedRunningTime="2025-10-04 02:58:12.648742459 +0000 UTC m=+1072.545701127" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.027902 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.086396 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-config-data\") pod \"19923833-9456-4f37-b45f-95d76e8b8483\" (UID: \"19923833-9456-4f37-b45f-95d76e8b8483\") " Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.086517 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-combined-ca-bundle\") pod \"19923833-9456-4f37-b45f-95d76e8b8483\" (UID: \"19923833-9456-4f37-b45f-95d76e8b8483\") " Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.086570 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljhm5\" (UniqueName: \"kubernetes.io/projected/19923833-9456-4f37-b45f-95d76e8b8483-kube-api-access-ljhm5\") pod \"19923833-9456-4f37-b45f-95d76e8b8483\" (UID: \"19923833-9456-4f37-b45f-95d76e8b8483\") " Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.086781 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-scripts\") pod \"19923833-9456-4f37-b45f-95d76e8b8483\" (UID: 
\"19923833-9456-4f37-b45f-95d76e8b8483\") " Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.092075 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-scripts" (OuterVolumeSpecName: "scripts") pod "19923833-9456-4f37-b45f-95d76e8b8483" (UID: "19923833-9456-4f37-b45f-95d76e8b8483"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.092092 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19923833-9456-4f37-b45f-95d76e8b8483-kube-api-access-ljhm5" (OuterVolumeSpecName: "kube-api-access-ljhm5") pod "19923833-9456-4f37-b45f-95d76e8b8483" (UID: "19923833-9456-4f37-b45f-95d76e8b8483"). InnerVolumeSpecName "kube-api-access-ljhm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.110656 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19923833-9456-4f37-b45f-95d76e8b8483" (UID: "19923833-9456-4f37-b45f-95d76e8b8483"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.129386 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-config-data" (OuterVolumeSpecName: "config-data") pod "19923833-9456-4f37-b45f-95d76e8b8483" (UID: "19923833-9456-4f37-b45f-95d76e8b8483"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.189232 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.189271 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.189288 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljhm5\" (UniqueName: \"kubernetes.io/projected/19923833-9456-4f37-b45f-95d76e8b8483-kube-api-access-ljhm5\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.189299 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19923833-9456-4f37-b45f-95d76e8b8483-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.639547 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5kmn5" event={"ID":"19923833-9456-4f37-b45f-95d76e8b8483","Type":"ContainerDied","Data":"1bcc4dbc3560b461fceaf4bed80874ba2093a468dcc5d1471afdb1d4a58cad3a"} Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.639606 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bcc4dbc3560b461fceaf4bed80874ba2093a468dcc5d1471afdb1d4a58cad3a" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.639992 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5kmn5" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.867180 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 04 02:58:14 crc kubenswrapper[4964]: E1004 02:58:14.868019 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19923833-9456-4f37-b45f-95d76e8b8483" containerName="nova-cell0-conductor-db-sync" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.868136 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="19923833-9456-4f37-b45f-95d76e8b8483" containerName="nova-cell0-conductor-db-sync" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.868425 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="19923833-9456-4f37-b45f-95d76e8b8483" containerName="nova-cell0-conductor-db-sync" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.869147 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.873836 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.875896 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-d8w9q" Oct 04 02:58:14 crc kubenswrapper[4964]: I1004 02:58:14.877373 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 04 02:58:15 crc kubenswrapper[4964]: I1004 02:58:15.002467 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1282768-6281-40a2-aaec-02d55f52579d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b1282768-6281-40a2-aaec-02d55f52579d\") " pod="openstack/nova-cell0-conductor-0" Oct 04 02:58:15 crc kubenswrapper[4964]: I1004 
02:58:15.002889 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1282768-6281-40a2-aaec-02d55f52579d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b1282768-6281-40a2-aaec-02d55f52579d\") " pod="openstack/nova-cell0-conductor-0" Oct 04 02:58:15 crc kubenswrapper[4964]: I1004 02:58:15.003096 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7fhx\" (UniqueName: \"kubernetes.io/projected/b1282768-6281-40a2-aaec-02d55f52579d-kube-api-access-b7fhx\") pod \"nova-cell0-conductor-0\" (UID: \"b1282768-6281-40a2-aaec-02d55f52579d\") " pod="openstack/nova-cell0-conductor-0" Oct 04 02:58:15 crc kubenswrapper[4964]: I1004 02:58:15.104862 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7fhx\" (UniqueName: \"kubernetes.io/projected/b1282768-6281-40a2-aaec-02d55f52579d-kube-api-access-b7fhx\") pod \"nova-cell0-conductor-0\" (UID: \"b1282768-6281-40a2-aaec-02d55f52579d\") " pod="openstack/nova-cell0-conductor-0" Oct 04 02:58:15 crc kubenswrapper[4964]: I1004 02:58:15.104983 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1282768-6281-40a2-aaec-02d55f52579d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b1282768-6281-40a2-aaec-02d55f52579d\") " pod="openstack/nova-cell0-conductor-0" Oct 04 02:58:15 crc kubenswrapper[4964]: I1004 02:58:15.105099 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1282768-6281-40a2-aaec-02d55f52579d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b1282768-6281-40a2-aaec-02d55f52579d\") " pod="openstack/nova-cell0-conductor-0" Oct 04 02:58:15 crc kubenswrapper[4964]: I1004 02:58:15.110334 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1282768-6281-40a2-aaec-02d55f52579d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b1282768-6281-40a2-aaec-02d55f52579d\") " pod="openstack/nova-cell0-conductor-0" Oct 04 02:58:15 crc kubenswrapper[4964]: I1004 02:58:15.113569 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1282768-6281-40a2-aaec-02d55f52579d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b1282768-6281-40a2-aaec-02d55f52579d\") " pod="openstack/nova-cell0-conductor-0" Oct 04 02:58:15 crc kubenswrapper[4964]: I1004 02:58:15.138164 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7fhx\" (UniqueName: \"kubernetes.io/projected/b1282768-6281-40a2-aaec-02d55f52579d-kube-api-access-b7fhx\") pod \"nova-cell0-conductor-0\" (UID: \"b1282768-6281-40a2-aaec-02d55f52579d\") " pod="openstack/nova-cell0-conductor-0" Oct 04 02:58:15 crc kubenswrapper[4964]: I1004 02:58:15.187944 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 04 02:58:15 crc kubenswrapper[4964]: I1004 02:58:15.679226 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 04 02:58:16 crc kubenswrapper[4964]: I1004 02:58:16.673008 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b1282768-6281-40a2-aaec-02d55f52579d","Type":"ContainerStarted","Data":"8415f5607462f41daa4fe1d0f8fda5a394ea80173aa614425d778b29a5080416"} Oct 04 02:58:16 crc kubenswrapper[4964]: I1004 02:58:16.673548 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b1282768-6281-40a2-aaec-02d55f52579d","Type":"ContainerStarted","Data":"bf3591810334bc399ce4473706cf77a864d292651866f938b544992da9500bdf"} Oct 04 02:58:16 crc kubenswrapper[4964]: I1004 02:58:16.676247 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 04 02:58:20 crc kubenswrapper[4964]: I1004 02:58:20.237112 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 04 02:58:20 crc kubenswrapper[4964]: I1004 02:58:20.266382 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=6.266358526 podStartE2EDuration="6.266358526s" podCreationTimestamp="2025-10-04 02:58:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:58:16.698402251 +0000 UTC m=+1076.595360929" watchObservedRunningTime="2025-10-04 02:58:20.266358526 +0000 UTC m=+1080.163317204" Oct 04 02:58:20 crc kubenswrapper[4964]: I1004 02:58:20.832540 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-ch92k"] Oct 04 02:58:20 crc kubenswrapper[4964]: I1004 02:58:20.833953 4964 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:20 crc kubenswrapper[4964]: I1004 02:58:20.837325 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 04 02:58:20 crc kubenswrapper[4964]: I1004 02:58:20.837512 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 04 02:58:20 crc kubenswrapper[4964]: I1004 02:58:20.842471 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ch92k"] Oct 04 02:58:20 crc kubenswrapper[4964]: I1004 02:58:20.921073 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b86fv\" (UniqueName: \"kubernetes.io/projected/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-kube-api-access-b86fv\") pod \"nova-cell0-cell-mapping-ch92k\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:20 crc kubenswrapper[4964]: I1004 02:58:20.921244 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ch92k\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:20 crc kubenswrapper[4964]: I1004 02:58:20.921483 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-config-data\") pod \"nova-cell0-cell-mapping-ch92k\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:20 crc kubenswrapper[4964]: I1004 02:58:20.921589 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-scripts\") pod \"nova-cell0-cell-mapping-ch92k\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.023481 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-scripts\") pod \"nova-cell0-cell-mapping-ch92k\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.023548 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b86fv\" (UniqueName: \"kubernetes.io/projected/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-kube-api-access-b86fv\") pod \"nova-cell0-cell-mapping-ch92k\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.023591 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ch92k\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.023754 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-config-data\") pod \"nova-cell0-cell-mapping-ch92k\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.029412 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-scripts\") pod \"nova-cell0-cell-mapping-ch92k\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.029850 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-config-data\") pod \"nova-cell0-cell-mapping-ch92k\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.030960 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ch92k\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.060250 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.061724 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.064334 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.076437 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.077828 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.080215 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.084650 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.097322 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.112633 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b86fv\" (UniqueName: \"kubernetes.io/projected/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-kube-api-access-b86fv\") pod \"nova-cell0-cell-mapping-ch92k\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.125112 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6zr6\" (UniqueName: \"kubernetes.io/projected/c8f3c888-5938-4862-9f0c-3b7b86566f84-kube-api-access-g6zr6\") pod \"nova-scheduler-0\" (UID: \"c8f3c888-5938-4862-9f0c-3b7b86566f84\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.125180 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f3c888-5938-4862-9f0c-3b7b86566f84-config-data\") pod \"nova-scheduler-0\" (UID: \"c8f3c888-5938-4862-9f0c-3b7b86566f84\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.125245 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/145964bf-f38c-4a86-8939-6934e9fd3d1b-logs\") pod \"nova-api-0\" (UID: 
\"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " pod="openstack/nova-api-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.125342 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f3c888-5938-4862-9f0c-3b7b86566f84-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c8f3c888-5938-4862-9f0c-3b7b86566f84\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.125428 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145964bf-f38c-4a86-8939-6934e9fd3d1b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " pod="openstack/nova-api-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.125513 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/145964bf-f38c-4a86-8939-6934e9fd3d1b-config-data\") pod \"nova-api-0\" (UID: \"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " pod="openstack/nova-api-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.125662 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ws2f\" (UniqueName: \"kubernetes.io/projected/145964bf-f38c-4a86-8939-6934e9fd3d1b-kube-api-access-4ws2f\") pod \"nova-api-0\" (UID: \"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " pod="openstack/nova-api-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.165184 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.230535 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f3c888-5938-4862-9f0c-3b7b86566f84-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c8f3c888-5938-4862-9f0c-3b7b86566f84\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.230590 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145964bf-f38c-4a86-8939-6934e9fd3d1b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " pod="openstack/nova-api-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.230627 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/145964bf-f38c-4a86-8939-6934e9fd3d1b-config-data\") pod \"nova-api-0\" (UID: \"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " pod="openstack/nova-api-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.230686 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ws2f\" (UniqueName: \"kubernetes.io/projected/145964bf-f38c-4a86-8939-6934e9fd3d1b-kube-api-access-4ws2f\") pod \"nova-api-0\" (UID: \"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " pod="openstack/nova-api-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.230727 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6zr6\" (UniqueName: \"kubernetes.io/projected/c8f3c888-5938-4862-9f0c-3b7b86566f84-kube-api-access-g6zr6\") pod \"nova-scheduler-0\" (UID: \"c8f3c888-5938-4862-9f0c-3b7b86566f84\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.230748 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f3c888-5938-4862-9f0c-3b7b86566f84-config-data\") pod \"nova-scheduler-0\" (UID: \"c8f3c888-5938-4862-9f0c-3b7b86566f84\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.230795 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/145964bf-f38c-4a86-8939-6934e9fd3d1b-logs\") pod \"nova-api-0\" (UID: \"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " pod="openstack/nova-api-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.231347 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/145964bf-f38c-4a86-8939-6934e9fd3d1b-logs\") pod \"nova-api-0\" (UID: \"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " pod="openstack/nova-api-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.231452 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.232485 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.233805 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f3c888-5938-4862-9f0c-3b7b86566f84-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c8f3c888-5938-4862-9f0c-3b7b86566f84\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.237105 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.238284 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/145964bf-f38c-4a86-8939-6934e9fd3d1b-config-data\") pod \"nova-api-0\" (UID: \"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " pod="openstack/nova-api-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.253198 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145964bf-f38c-4a86-8939-6934e9fd3d1b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " pod="openstack/nova-api-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.262242 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f3c888-5938-4862-9f0c-3b7b86566f84-config-data\") pod \"nova-scheduler-0\" (UID: \"c8f3c888-5938-4862-9f0c-3b7b86566f84\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.264061 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6zr6\" (UniqueName: \"kubernetes.io/projected/c8f3c888-5938-4862-9f0c-3b7b86566f84-kube-api-access-g6zr6\") pod \"nova-scheduler-0\" (UID: \"c8f3c888-5938-4862-9f0c-3b7b86566f84\") " 
pod="openstack/nova-scheduler-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.270668 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ws2f\" (UniqueName: \"kubernetes.io/projected/145964bf-f38c-4a86-8939-6934e9fd3d1b-kube-api-access-4ws2f\") pod \"nova-api-0\" (UID: \"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " pod="openstack/nova-api-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.283691 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.285052 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.294681 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.294884 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.321192 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.328089 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.332929 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-config-data\") pod \"nova-metadata-0\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " pod="openstack/nova-metadata-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.333164 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ghsg\" (UniqueName: \"kubernetes.io/projected/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-kube-api-access-6ghsg\") pod \"nova-metadata-0\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " pod="openstack/nova-metadata-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.333255 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.333307 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpg4r\" (UniqueName: \"kubernetes.io/projected/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-kube-api-access-zpg4r\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.333333 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-logs\") pod \"nova-metadata-0\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " pod="openstack/nova-metadata-0" 
Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.333350 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " pod="openstack/nova-metadata-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.333387 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.368987 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-rvg6j"] Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.393695 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.423945 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-rvg6j"] Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.434462 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-rvg6j\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.434542 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-config-data\") pod \"nova-metadata-0\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " pod="openstack/nova-metadata-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.434565 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ghsg\" (UniqueName: \"kubernetes.io/projected/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-kube-api-access-6ghsg\") pod \"nova-metadata-0\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " pod="openstack/nova-metadata-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.434616 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-dns-svc\") pod \"dnsmasq-dns-566b5b7845-rvg6j\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.434668 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-rvg6j\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.434697 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-config\") pod \"dnsmasq-dns-566b5b7845-rvg6j\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.434715 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.434737 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55vkg\" (UniqueName: \"kubernetes.io/projected/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-kube-api-access-55vkg\") pod \"dnsmasq-dns-566b5b7845-rvg6j\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.434792 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpg4r\" (UniqueName: \"kubernetes.io/projected/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-kube-api-access-zpg4r\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.434826 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-logs\") pod \"nova-metadata-0\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " pod="openstack/nova-metadata-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.434842 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " pod="openstack/nova-metadata-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.434860 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.436305 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-logs\") pod \"nova-metadata-0\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " pod="openstack/nova-metadata-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.439659 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.440121 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 
02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.441348 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-config-data\") pod \"nova-metadata-0\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " pod="openstack/nova-metadata-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.444809 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " pod="openstack/nova-metadata-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.455068 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpg4r\" (UniqueName: \"kubernetes.io/projected/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-kube-api-access-zpg4r\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.456449 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ghsg\" (UniqueName: \"kubernetes.io/projected/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-kube-api-access-6ghsg\") pod \"nova-metadata-0\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " pod="openstack/nova-metadata-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.484300 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.538268 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-rvg6j\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.538336 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-config\") pod \"dnsmasq-dns-566b5b7845-rvg6j\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.538367 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55vkg\" (UniqueName: \"kubernetes.io/projected/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-kube-api-access-55vkg\") pod \"dnsmasq-dns-566b5b7845-rvg6j\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.538442 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-rvg6j\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.538554 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-dns-svc\") pod \"dnsmasq-dns-566b5b7845-rvg6j\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 
02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.539299 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-config\") pod \"dnsmasq-dns-566b5b7845-rvg6j\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.539332 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-rvg6j\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.543278 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-dns-svc\") pod \"dnsmasq-dns-566b5b7845-rvg6j\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.543374 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-rvg6j\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.556312 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55vkg\" (UniqueName: \"kubernetes.io/projected/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-kube-api-access-55vkg\") pod \"dnsmasq-dns-566b5b7845-rvg6j\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.665029 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.689216 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.728013 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.780061 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ch92k"] Oct 04 02:58:21 crc kubenswrapper[4964]: W1004 02:58:21.807074 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b2b94b6_a8c2_43d3_b5b6_10e02544fe47.slice/crio-985adeffc767507265d1114802a5a8aee25d742ac195c6187aa921333920d5fe WatchSource:0}: Error finding container 985adeffc767507265d1114802a5a8aee25d742ac195c6187aa921333920d5fe: Status 404 returned error can't find the container with id 985adeffc767507265d1114802a5a8aee25d742ac195c6187aa921333920d5fe Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.950820 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zb6n7"] Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.951971 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.954646 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.954824 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 04 02:58:21 crc kubenswrapper[4964]: I1004 02:58:21.974423 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zb6n7"] Oct 04 02:58:22 crc kubenswrapper[4964]: W1004 02:58:22.009665 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod145964bf_f38c_4a86_8939_6934e9fd3d1b.slice/crio-36f20e83f77dcb4f7bd7cd264dbaf0de2f98d544bdf30e3ceb4731da054ac975 WatchSource:0}: Error finding container 36f20e83f77dcb4f7bd7cd264dbaf0de2f98d544bdf30e3ceb4731da054ac975: Status 404 returned error can't find the container with id 36f20e83f77dcb4f7bd7cd264dbaf0de2f98d544bdf30e3ceb4731da054ac975 Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.026171 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.058490 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zb6n7\" (UID: \"edc8e261-b823-4e47-8434-69659d723885\") " pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.058752 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-config-data\") pod 
\"nova-cell1-conductor-db-sync-zb6n7\" (UID: \"edc8e261-b823-4e47-8434-69659d723885\") " pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.058782 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-scripts\") pod \"nova-cell1-conductor-db-sync-zb6n7\" (UID: \"edc8e261-b823-4e47-8434-69659d723885\") " pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.058862 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62d7k\" (UniqueName: \"kubernetes.io/projected/edc8e261-b823-4e47-8434-69659d723885-kube-api-access-62d7k\") pod \"nova-cell1-conductor-db-sync-zb6n7\" (UID: \"edc8e261-b823-4e47-8434-69659d723885\") " pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.086972 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 02:58:22 crc kubenswrapper[4964]: W1004 02:58:22.091503 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8f3c888_5938_4862_9f0c_3b7b86566f84.slice/crio-dc0a4c31f27f1b58f3ac5d943b277c41c3c4cd5e768803fc606c334f0ab3f833 WatchSource:0}: Error finding container dc0a4c31f27f1b58f3ac5d943b277c41c3c4cd5e768803fc606c334f0ab3f833: Status 404 returned error can't find the container with id dc0a4c31f27f1b58f3ac5d943b277c41c3c4cd5e768803fc606c334f0ab3f833 Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.160047 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zb6n7\" (UID: 
\"edc8e261-b823-4e47-8434-69659d723885\") " pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.160125 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-config-data\") pod \"nova-cell1-conductor-db-sync-zb6n7\" (UID: \"edc8e261-b823-4e47-8434-69659d723885\") " pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.160154 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-scripts\") pod \"nova-cell1-conductor-db-sync-zb6n7\" (UID: \"edc8e261-b823-4e47-8434-69659d723885\") " pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.160194 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62d7k\" (UniqueName: \"kubernetes.io/projected/edc8e261-b823-4e47-8434-69659d723885-kube-api-access-62d7k\") pod \"nova-cell1-conductor-db-sync-zb6n7\" (UID: \"edc8e261-b823-4e47-8434-69659d723885\") " pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.163705 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-scripts\") pod \"nova-cell1-conductor-db-sync-zb6n7\" (UID: \"edc8e261-b823-4e47-8434-69659d723885\") " pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.164211 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-config-data\") pod \"nova-cell1-conductor-db-sync-zb6n7\" (UID: \"edc8e261-b823-4e47-8434-69659d723885\") " 
pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.164285 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zb6n7\" (UID: \"edc8e261-b823-4e47-8434-69659d723885\") " pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.174833 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62d7k\" (UniqueName: \"kubernetes.io/projected/edc8e261-b823-4e47-8434-69659d723885-kube-api-access-62d7k\") pod \"nova-cell1-conductor-db-sync-zb6n7\" (UID: \"edc8e261-b823-4e47-8434-69659d723885\") " pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.257520 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.278123 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.354371 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-rvg6j"] Oct 04 02:58:22 crc kubenswrapper[4964]: W1004 02:58:22.355925 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0100d1f7_03f9_4c09_a6df_4b30250ccfa4.slice/crio-a0d9730f923f98c0e363cf21a451eefbb038a74f34cfa86688589bf28fef8ea8 WatchSource:0}: Error finding container a0d9730f923f98c0e363cf21a451eefbb038a74f34cfa86688589bf28fef8ea8: Status 404 returned error can't find the container with id a0d9730f923f98c0e363cf21a451eefbb038a74f34cfa86688589bf28fef8ea8 Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.369164 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:58:22 crc kubenswrapper[4964]: W1004 02:58:22.371205 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ffbe74b_8557_4f46_b7b7_b63a529f07e3.slice/crio-02dccbe42e7647652d8a157c4a1b926478679680c732443e1bb8497bbbd2a9ba WatchSource:0}: Error finding container 02dccbe42e7647652d8a157c4a1b926478679680c732443e1bb8497bbbd2a9ba: Status 404 returned error can't find the container with id 02dccbe42e7647652d8a157c4a1b926478679680c732443e1bb8497bbbd2a9ba Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.716681 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zb6n7"] Oct 04 02:58:22 crc kubenswrapper[4964]: W1004 02:58:22.718733 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedc8e261_b823_4e47_8434_69659d723885.slice/crio-59256a5455cc7806dd030025a286cc3b1245ac99008096520df3d938abaf8c66 WatchSource:0}: Error finding container 
59256a5455cc7806dd030025a286cc3b1245ac99008096520df3d938abaf8c66: Status 404 returned error can't find the container with id 59256a5455cc7806dd030025a286cc3b1245ac99008096520df3d938abaf8c66 Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.753111 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5","Type":"ContainerStarted","Data":"219992e9cec7463418042e4278814791a8bea32c85bffe9fb5a68179a52e341a"} Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.754887 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ch92k" event={"ID":"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47","Type":"ContainerStarted","Data":"36174e4e8620c40cd826109a2f529d1fe17bfffa0bf91225f780722d04ad006f"} Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.754944 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ch92k" event={"ID":"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47","Type":"ContainerStarted","Data":"985adeffc767507265d1114802a5a8aee25d742ac195c6187aa921333920d5fe"} Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.756512 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9ffbe74b-8557-4f46-b7b7-b63a529f07e3","Type":"ContainerStarted","Data":"02dccbe42e7647652d8a157c4a1b926478679680c732443e1bb8497bbbd2a9ba"} Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.761110 4964 generic.go:334] "Generic (PLEG): container finished" podID="0100d1f7-03f9-4c09-a6df-4b30250ccfa4" containerID="3a47a534223a9e03e808c28d961d55dbd817912087ed88a5d7c930060c037a0c" exitCode=0 Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.761186 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" 
event={"ID":"0100d1f7-03f9-4c09-a6df-4b30250ccfa4","Type":"ContainerDied","Data":"3a47a534223a9e03e808c28d961d55dbd817912087ed88a5d7c930060c037a0c"} Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.761211 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" event={"ID":"0100d1f7-03f9-4c09-a6df-4b30250ccfa4","Type":"ContainerStarted","Data":"a0d9730f923f98c0e363cf21a451eefbb038a74f34cfa86688589bf28fef8ea8"} Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.762429 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c8f3c888-5938-4862-9f0c-3b7b86566f84","Type":"ContainerStarted","Data":"dc0a4c31f27f1b58f3ac5d943b277c41c3c4cd5e768803fc606c334f0ab3f833"} Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.765579 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"145964bf-f38c-4a86-8939-6934e9fd3d1b","Type":"ContainerStarted","Data":"36f20e83f77dcb4f7bd7cd264dbaf0de2f98d544bdf30e3ceb4731da054ac975"} Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.767924 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zb6n7" event={"ID":"edc8e261-b823-4e47-8434-69659d723885","Type":"ContainerStarted","Data":"59256a5455cc7806dd030025a286cc3b1245ac99008096520df3d938abaf8c66"} Oct 04 02:58:22 crc kubenswrapper[4964]: I1004 02:58:22.779385 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-ch92k" podStartSLOduration=2.779366091 podStartE2EDuration="2.779366091s" podCreationTimestamp="2025-10-04 02:58:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:58:22.776380142 +0000 UTC m=+1082.673338780" watchObservedRunningTime="2025-10-04 02:58:22.779366091 +0000 UTC m=+1082.676324729" Oct 04 02:58:23 crc kubenswrapper[4964]: 
I1004 02:58:23.787006 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zb6n7" event={"ID":"edc8e261-b823-4e47-8434-69659d723885","Type":"ContainerStarted","Data":"aadada084f99d6f23988d9418c9ae17e657a6447246d9cd6db110f53606f179c"} Oct 04 02:58:23 crc kubenswrapper[4964]: I1004 02:58:23.791728 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" event={"ID":"0100d1f7-03f9-4c09-a6df-4b30250ccfa4","Type":"ContainerStarted","Data":"78dda55a03c1069a7ccb6083e8b394cec895e7638b9524bf54b916c003ac146e"} Oct 04 02:58:23 crc kubenswrapper[4964]: I1004 02:58:23.791956 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:23 crc kubenswrapper[4964]: I1004 02:58:23.811429 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-zb6n7" podStartSLOduration=2.81141299 podStartE2EDuration="2.81141299s" podCreationTimestamp="2025-10-04 02:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:58:23.807892236 +0000 UTC m=+1083.704850884" watchObservedRunningTime="2025-10-04 02:58:23.81141299 +0000 UTC m=+1083.708371628" Oct 04 02:58:23 crc kubenswrapper[4964]: I1004 02:58:23.832345 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" podStartSLOduration=2.832327498 podStartE2EDuration="2.832327498s" podCreationTimestamp="2025-10-04 02:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:58:23.825079325 +0000 UTC m=+1083.722037963" watchObservedRunningTime="2025-10-04 02:58:23.832327498 +0000 UTC m=+1083.729286136" Oct 04 02:58:25 crc kubenswrapper[4964]: I1004 02:58:25.015806 4964 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:58:25 crc kubenswrapper[4964]: I1004 02:58:25.020173 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 02:58:25 crc kubenswrapper[4964]: I1004 02:58:25.808897 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5","Type":"ContainerStarted","Data":"8f0d133e400581fc931972f43c7d0764d9d640fac6970ab9e6efc9740254826f"} Oct 04 02:58:25 crc kubenswrapper[4964]: I1004 02:58:25.808951 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8f0d133e400581fc931972f43c7d0764d9d640fac6970ab9e6efc9740254826f" gracePeriod=30 Oct 04 02:58:25 crc kubenswrapper[4964]: I1004 02:58:25.813591 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9ffbe74b-8557-4f46-b7b7-b63a529f07e3","Type":"ContainerStarted","Data":"40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa"} Oct 04 02:58:25 crc kubenswrapper[4964]: I1004 02:58:25.813640 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9ffbe74b-8557-4f46-b7b7-b63a529f07e3","Type":"ContainerStarted","Data":"7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd"} Oct 04 02:58:25 crc kubenswrapper[4964]: I1004 02:58:25.813757 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9ffbe74b-8557-4f46-b7b7-b63a529f07e3" containerName="nova-metadata-log" containerID="cri-o://7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd" gracePeriod=30 Oct 04 02:58:25 crc kubenswrapper[4964]: I1004 02:58:25.813847 4964 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/nova-metadata-0" podUID="9ffbe74b-8557-4f46-b7b7-b63a529f07e3" containerName="nova-metadata-metadata" containerID="cri-o://40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa" gracePeriod=30 Oct 04 02:58:25 crc kubenswrapper[4964]: I1004 02:58:25.816253 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c8f3c888-5938-4862-9f0c-3b7b86566f84","Type":"ContainerStarted","Data":"c9c13c261560dd78a5b34d75c884f272d7a55d96651af2b9d544463d41928ee2"} Oct 04 02:58:25 crc kubenswrapper[4964]: I1004 02:58:25.819143 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"145964bf-f38c-4a86-8939-6934e9fd3d1b","Type":"ContainerStarted","Data":"4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6"} Oct 04 02:58:25 crc kubenswrapper[4964]: I1004 02:58:25.819181 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"145964bf-f38c-4a86-8939-6934e9fd3d1b","Type":"ContainerStarted","Data":"c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f"} Oct 04 02:58:25 crc kubenswrapper[4964]: I1004 02:58:25.841533 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.359253083 podStartE2EDuration="4.841514926s" podCreationTimestamp="2025-10-04 02:58:21 +0000 UTC" firstStartedPulling="2025-10-04 02:58:22.257344378 +0000 UTC m=+1082.154303016" lastFinishedPulling="2025-10-04 02:58:24.739606221 +0000 UTC m=+1084.636564859" observedRunningTime="2025-10-04 02:58:25.831003516 +0000 UTC m=+1085.727962194" watchObservedRunningTime="2025-10-04 02:58:25.841514926 +0000 UTC m=+1085.738473564" Oct 04 02:58:25 crc kubenswrapper[4964]: I1004 02:58:25.857665 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.128646588 podStartE2EDuration="4.857645587s" 
podCreationTimestamp="2025-10-04 02:58:21 +0000 UTC" firstStartedPulling="2025-10-04 02:58:22.012362189 +0000 UTC m=+1081.909320827" lastFinishedPulling="2025-10-04 02:58:24.741361188 +0000 UTC m=+1084.638319826" observedRunningTime="2025-10-04 02:58:25.853412355 +0000 UTC m=+1085.750370993" watchObservedRunningTime="2025-10-04 02:58:25.857645587 +0000 UTC m=+1085.754604225" Oct 04 02:58:25 crc kubenswrapper[4964]: I1004 02:58:25.891147 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.526143892 podStartE2EDuration="4.891126242s" podCreationTimestamp="2025-10-04 02:58:21 +0000 UTC" firstStartedPulling="2025-10-04 02:58:22.373313107 +0000 UTC m=+1082.270271745" lastFinishedPulling="2025-10-04 02:58:24.738295457 +0000 UTC m=+1084.635254095" observedRunningTime="2025-10-04 02:58:25.877330624 +0000 UTC m=+1085.774289282" watchObservedRunningTime="2025-10-04 02:58:25.891126242 +0000 UTC m=+1085.788084890" Oct 04 02:58:25 crc kubenswrapper[4964]: I1004 02:58:25.895250 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.231234982 podStartE2EDuration="4.895234943s" podCreationTimestamp="2025-10-04 02:58:21 +0000 UTC" firstStartedPulling="2025-10-04 02:58:22.093590991 +0000 UTC m=+1081.990549629" lastFinishedPulling="2025-10-04 02:58:24.757590962 +0000 UTC m=+1084.654549590" observedRunningTime="2025-10-04 02:58:25.892325335 +0000 UTC m=+1085.789283983" watchObservedRunningTime="2025-10-04 02:58:25.895234943 +0000 UTC m=+1085.792193591" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.484991 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.502766 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.561803 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-logs\") pod \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.561842 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-config-data\") pod \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.561872 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ghsg\" (UniqueName: \"kubernetes.io/projected/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-kube-api-access-6ghsg\") pod \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.561897 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-combined-ca-bundle\") pod \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\" (UID: \"9ffbe74b-8557-4f46-b7b7-b63a529f07e3\") " Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.562979 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-logs" (OuterVolumeSpecName: "logs") pod "9ffbe74b-8557-4f46-b7b7-b63a529f07e3" (UID: "9ffbe74b-8557-4f46-b7b7-b63a529f07e3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.569749 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-kube-api-access-6ghsg" (OuterVolumeSpecName: "kube-api-access-6ghsg") pod "9ffbe74b-8557-4f46-b7b7-b63a529f07e3" (UID: "9ffbe74b-8557-4f46-b7b7-b63a529f07e3"). InnerVolumeSpecName "kube-api-access-6ghsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.587750 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-config-data" (OuterVolumeSpecName: "config-data") pod "9ffbe74b-8557-4f46-b7b7-b63a529f07e3" (UID: "9ffbe74b-8557-4f46-b7b7-b63a529f07e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.588454 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ffbe74b-8557-4f46-b7b7-b63a529f07e3" (UID: "9ffbe74b-8557-4f46-b7b7-b63a529f07e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.664131 4964 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-logs\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.664166 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.664177 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ghsg\" (UniqueName: \"kubernetes.io/projected/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-kube-api-access-6ghsg\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.664188 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ffbe74b-8557-4f46-b7b7-b63a529f07e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.666088 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.829922 4964 generic.go:334] "Generic (PLEG): container finished" podID="9ffbe74b-8557-4f46-b7b7-b63a529f07e3" containerID="40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa" exitCode=0 Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.829983 4964 generic.go:334] "Generic (PLEG): container finished" podID="9ffbe74b-8557-4f46-b7b7-b63a529f07e3" containerID="7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd" exitCode=143 Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.830036 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.830089 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9ffbe74b-8557-4f46-b7b7-b63a529f07e3","Type":"ContainerDied","Data":"40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa"} Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.830130 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9ffbe74b-8557-4f46-b7b7-b63a529f07e3","Type":"ContainerDied","Data":"7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd"} Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.830152 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9ffbe74b-8557-4f46-b7b7-b63a529f07e3","Type":"ContainerDied","Data":"02dccbe42e7647652d8a157c4a1b926478679680c732443e1bb8497bbbd2a9ba"} Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.830176 4964 scope.go:117] "RemoveContainer" containerID="40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.851750 4964 scope.go:117] "RemoveContainer" containerID="7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.893886 4964 scope.go:117] "RemoveContainer" containerID="40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa" Oct 04 02:58:26 crc kubenswrapper[4964]: E1004 02:58:26.894730 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa\": container with ID starting with 40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa not found: ID does not exist" containerID="40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa" Oct 04 02:58:26 crc kubenswrapper[4964]: 
I1004 02:58:26.894775 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa"} err="failed to get container status \"40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa\": rpc error: code = NotFound desc = could not find container \"40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa\": container with ID starting with 40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa not found: ID does not exist" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.894802 4964 scope.go:117] "RemoveContainer" containerID="7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd" Oct 04 02:58:26 crc kubenswrapper[4964]: E1004 02:58:26.895223 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd\": container with ID starting with 7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd not found: ID does not exist" containerID="7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.895252 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd"} err="failed to get container status \"7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd\": rpc error: code = NotFound desc = could not find container \"7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd\": container with ID starting with 7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd not found: ID does not exist" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.895270 4964 scope.go:117] "RemoveContainer" containerID="40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa" Oct 04 02:58:26 crc 
kubenswrapper[4964]: I1004 02:58:26.895645 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa"} err="failed to get container status \"40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa\": rpc error: code = NotFound desc = could not find container \"40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa\": container with ID starting with 40bb13f4e393bf16a36c44e6c689231354fe2d057f68c48d93dab02c541841aa not found: ID does not exist" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.895689 4964 scope.go:117] "RemoveContainer" containerID="7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.896729 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd"} err="failed to get container status \"7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd\": rpc error: code = NotFound desc = could not find container \"7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd\": container with ID starting with 7f778dd506d1605ea83570f329417d046394696d74d5853c26563835749f99cd not found: ID does not exist" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.904650 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.913540 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.922826 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:58:26 crc kubenswrapper[4964]: E1004 02:58:26.923242 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ffbe74b-8557-4f46-b7b7-b63a529f07e3" 
containerName="nova-metadata-metadata" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.923266 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffbe74b-8557-4f46-b7b7-b63a529f07e3" containerName="nova-metadata-metadata" Oct 04 02:58:26 crc kubenswrapper[4964]: E1004 02:58:26.923303 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ffbe74b-8557-4f46-b7b7-b63a529f07e3" containerName="nova-metadata-log" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.923312 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffbe74b-8557-4f46-b7b7-b63a529f07e3" containerName="nova-metadata-log" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.923513 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ffbe74b-8557-4f46-b7b7-b63a529f07e3" containerName="nova-metadata-metadata" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.923542 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ffbe74b-8557-4f46-b7b7-b63a529f07e3" containerName="nova-metadata-log" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.928381 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.930144 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.930270 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.933724 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.970965 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fcfecfe-b0e6-47a9-a719-00498983c990-logs\") pod \"nova-metadata-0\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " pod="openstack/nova-metadata-0" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.971043 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " pod="openstack/nova-metadata-0" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.971304 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " pod="openstack/nova-metadata-0" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.971419 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-config-data\") pod \"nova-metadata-0\" (UID: 
\"6fcfecfe-b0e6-47a9-a719-00498983c990\") " pod="openstack/nova-metadata-0" Oct 04 02:58:26 crc kubenswrapper[4964]: I1004 02:58:26.971658 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlv78\" (UniqueName: \"kubernetes.io/projected/6fcfecfe-b0e6-47a9-a719-00498983c990-kube-api-access-qlv78\") pod \"nova-metadata-0\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " pod="openstack/nova-metadata-0" Oct 04 02:58:27 crc kubenswrapper[4964]: I1004 02:58:27.073701 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlv78\" (UniqueName: \"kubernetes.io/projected/6fcfecfe-b0e6-47a9-a719-00498983c990-kube-api-access-qlv78\") pod \"nova-metadata-0\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " pod="openstack/nova-metadata-0" Oct 04 02:58:27 crc kubenswrapper[4964]: I1004 02:58:27.073745 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fcfecfe-b0e6-47a9-a719-00498983c990-logs\") pod \"nova-metadata-0\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " pod="openstack/nova-metadata-0" Oct 04 02:58:27 crc kubenswrapper[4964]: I1004 02:58:27.073795 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " pod="openstack/nova-metadata-0" Oct 04 02:58:27 crc kubenswrapper[4964]: I1004 02:58:27.073870 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " pod="openstack/nova-metadata-0" Oct 04 02:58:27 crc kubenswrapper[4964]: I1004 
02:58:27.073904 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-config-data\") pod \"nova-metadata-0\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " pod="openstack/nova-metadata-0" Oct 04 02:58:27 crc kubenswrapper[4964]: I1004 02:58:27.074231 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fcfecfe-b0e6-47a9-a719-00498983c990-logs\") pod \"nova-metadata-0\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " pod="openstack/nova-metadata-0" Oct 04 02:58:27 crc kubenswrapper[4964]: I1004 02:58:27.079144 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " pod="openstack/nova-metadata-0" Oct 04 02:58:27 crc kubenswrapper[4964]: I1004 02:58:27.082126 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-config-data\") pod \"nova-metadata-0\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " pod="openstack/nova-metadata-0" Oct 04 02:58:27 crc kubenswrapper[4964]: I1004 02:58:27.090731 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " pod="openstack/nova-metadata-0" Oct 04 02:58:27 crc kubenswrapper[4964]: I1004 02:58:27.095116 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlv78\" (UniqueName: \"kubernetes.io/projected/6fcfecfe-b0e6-47a9-a719-00498983c990-kube-api-access-qlv78\") pod 
\"nova-metadata-0\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " pod="openstack/nova-metadata-0" Oct 04 02:58:27 crc kubenswrapper[4964]: I1004 02:58:27.251093 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 02:58:27 crc kubenswrapper[4964]: I1004 02:58:27.717094 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:58:27 crc kubenswrapper[4964]: W1004 02:58:27.727995 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fcfecfe_b0e6_47a9_a719_00498983c990.slice/crio-e31b149f7c27ca0767a1f82d332c38befb3060351d9a9c4a75780b0d9a0b951c WatchSource:0}: Error finding container e31b149f7c27ca0767a1f82d332c38befb3060351d9a9c4a75780b0d9a0b951c: Status 404 returned error can't find the container with id e31b149f7c27ca0767a1f82d332c38befb3060351d9a9c4a75780b0d9a0b951c Oct 04 02:58:27 crc kubenswrapper[4964]: I1004 02:58:27.841644 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6fcfecfe-b0e6-47a9-a719-00498983c990","Type":"ContainerStarted","Data":"e31b149f7c27ca0767a1f82d332c38befb3060351d9a9c4a75780b0d9a0b951c"} Oct 04 02:58:28 crc kubenswrapper[4964]: I1004 02:58:28.855264 4964 generic.go:334] "Generic (PLEG): container finished" podID="3b2b94b6-a8c2-43d3-b5b6-10e02544fe47" containerID="36174e4e8620c40cd826109a2f529d1fe17bfffa0bf91225f780722d04ad006f" exitCode=0 Oct 04 02:58:28 crc kubenswrapper[4964]: I1004 02:58:28.855919 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ffbe74b-8557-4f46-b7b7-b63a529f07e3" path="/var/lib/kubelet/pods/9ffbe74b-8557-4f46-b7b7-b63a529f07e3/volumes" Oct 04 02:58:28 crc kubenswrapper[4964]: I1004 02:58:28.856607 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"6fcfecfe-b0e6-47a9-a719-00498983c990","Type":"ContainerStarted","Data":"4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f"} Oct 04 02:58:28 crc kubenswrapper[4964]: I1004 02:58:28.856657 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6fcfecfe-b0e6-47a9-a719-00498983c990","Type":"ContainerStarted","Data":"531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731"} Oct 04 02:58:28 crc kubenswrapper[4964]: I1004 02:58:28.856670 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ch92k" event={"ID":"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47","Type":"ContainerDied","Data":"36174e4e8620c40cd826109a2f529d1fe17bfffa0bf91225f780722d04ad006f"} Oct 04 02:58:28 crc kubenswrapper[4964]: I1004 02:58:28.888911 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.888891786 podStartE2EDuration="2.888891786s" podCreationTimestamp="2025-10-04 02:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:58:28.883662336 +0000 UTC m=+1088.780620984" watchObservedRunningTime="2025-10-04 02:58:28.888891786 +0000 UTC m=+1088.785850434" Oct 04 02:58:29 crc kubenswrapper[4964]: I1004 02:58:29.871198 4964 generic.go:334] "Generic (PLEG): container finished" podID="edc8e261-b823-4e47-8434-69659d723885" containerID="aadada084f99d6f23988d9418c9ae17e657a6447246d9cd6db110f53606f179c" exitCode=0 Oct 04 02:58:29 crc kubenswrapper[4964]: I1004 02:58:29.871325 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zb6n7" event={"ID":"edc8e261-b823-4e47-8434-69659d723885","Type":"ContainerDied","Data":"aadada084f99d6f23988d9418c9ae17e657a6447246d9cd6db110f53606f179c"} Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.298435 4964 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.339735 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b86fv\" (UniqueName: \"kubernetes.io/projected/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-kube-api-access-b86fv\") pod \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.340056 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-combined-ca-bundle\") pod \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.340100 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-scripts\") pod \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.340810 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-config-data\") pod \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\" (UID: \"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47\") " Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.345230 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-scripts" (OuterVolumeSpecName: "scripts") pod "3b2b94b6-a8c2-43d3-b5b6-10e02544fe47" (UID: "3b2b94b6-a8c2-43d3-b5b6-10e02544fe47"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.346475 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-kube-api-access-b86fv" (OuterVolumeSpecName: "kube-api-access-b86fv") pod "3b2b94b6-a8c2-43d3-b5b6-10e02544fe47" (UID: "3b2b94b6-a8c2-43d3-b5b6-10e02544fe47"). InnerVolumeSpecName "kube-api-access-b86fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.375746 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b2b94b6-a8c2-43d3-b5b6-10e02544fe47" (UID: "3b2b94b6-a8c2-43d3-b5b6-10e02544fe47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.376554 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-config-data" (OuterVolumeSpecName: "config-data") pod "3b2b94b6-a8c2-43d3-b5b6-10e02544fe47" (UID: "3b2b94b6-a8c2-43d3-b5b6-10e02544fe47"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.443087 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.443116 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.443125 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.443135 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b86fv\" (UniqueName: \"kubernetes.io/projected/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47-kube-api-access-b86fv\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.886750 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ch92k" event={"ID":"3b2b94b6-a8c2-43d3-b5b6-10e02544fe47","Type":"ContainerDied","Data":"985adeffc767507265d1114802a5a8aee25d742ac195c6187aa921333920d5fe"} Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.886770 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ch92k" Oct 04 02:58:30 crc kubenswrapper[4964]: I1004 02:58:30.886792 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="985adeffc767507265d1114802a5a8aee25d742ac195c6187aa921333920d5fe" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.123813 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.124425 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c8f3c888-5938-4862-9f0c-3b7b86566f84" containerName="nova-scheduler-scheduler" containerID="cri-o://c9c13c261560dd78a5b34d75c884f272d7a55d96651af2b9d544463d41928ee2" gracePeriod=30 Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.144737 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.145040 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="145964bf-f38c-4a86-8939-6934e9fd3d1b" containerName="nova-api-log" containerID="cri-o://c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f" gracePeriod=30 Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.145327 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="145964bf-f38c-4a86-8939-6934e9fd3d1b" containerName="nova-api-api" containerID="cri-o://4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6" gracePeriod=30 Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.163875 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.164107 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6fcfecfe-b0e6-47a9-a719-00498983c990" 
containerName="nova-metadata-log" containerID="cri-o://531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731" gracePeriod=30 Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.164571 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6fcfecfe-b0e6-47a9-a719-00498983c990" containerName="nova-metadata-metadata" containerID="cri-o://4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f" gracePeriod=30 Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.554289 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.670409 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-config-data\") pod \"edc8e261-b823-4e47-8434-69659d723885\" (UID: \"edc8e261-b823-4e47-8434-69659d723885\") " Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.670454 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62d7k\" (UniqueName: \"kubernetes.io/projected/edc8e261-b823-4e47-8434-69659d723885-kube-api-access-62d7k\") pod \"edc8e261-b823-4e47-8434-69659d723885\" (UID: \"edc8e261-b823-4e47-8434-69659d723885\") " Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.670490 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-combined-ca-bundle\") pod \"edc8e261-b823-4e47-8434-69659d723885\" (UID: \"edc8e261-b823-4e47-8434-69659d723885\") " Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.670713 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-scripts\") pod \"edc8e261-b823-4e47-8434-69659d723885\" (UID: \"edc8e261-b823-4e47-8434-69659d723885\") " Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.677926 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edc8e261-b823-4e47-8434-69659d723885-kube-api-access-62d7k" (OuterVolumeSpecName: "kube-api-access-62d7k") pod "edc8e261-b823-4e47-8434-69659d723885" (UID: "edc8e261-b823-4e47-8434-69659d723885"). InnerVolumeSpecName "kube-api-access-62d7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.682076 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-scripts" (OuterVolumeSpecName: "scripts") pod "edc8e261-b823-4e47-8434-69659d723885" (UID: "edc8e261-b823-4e47-8434-69659d723885"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.698249 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-config-data" (OuterVolumeSpecName: "config-data") pod "edc8e261-b823-4e47-8434-69659d723885" (UID: "edc8e261-b823-4e47-8434-69659d723885"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.699688 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edc8e261-b823-4e47-8434-69659d723885" (UID: "edc8e261-b823-4e47-8434-69659d723885"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.732764 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.749708 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.754946 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.772520 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.772559 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.772569 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc8e261-b823-4e47-8434-69659d723885-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.772577 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62d7k\" (UniqueName: \"kubernetes.io/projected/edc8e261-b823-4e47-8434-69659d723885-kube-api-access-62d7k\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.787276 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-2rr95"] Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.787543 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" 
podUID="697e704a-98d7-40a5-b685-d46635ebe33d" containerName="dnsmasq-dns" containerID="cri-o://cf49bbb7d96dac475f7539f5096c440a0502a63733f0e17d73401e5cc66c5ee3" gracePeriod=10 Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.873410 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/145964bf-f38c-4a86-8939-6934e9fd3d1b-logs\") pod \"145964bf-f38c-4a86-8939-6934e9fd3d1b\" (UID: \"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.873853 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/145964bf-f38c-4a86-8939-6934e9fd3d1b-logs" (OuterVolumeSpecName: "logs") pod "145964bf-f38c-4a86-8939-6934e9fd3d1b" (UID: "145964bf-f38c-4a86-8939-6934e9fd3d1b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.874362 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fcfecfe-b0e6-47a9-a719-00498983c990-logs\") pod \"6fcfecfe-b0e6-47a9-a719-00498983c990\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.874966 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fcfecfe-b0e6-47a9-a719-00498983c990-logs" (OuterVolumeSpecName: "logs") pod "6fcfecfe-b0e6-47a9-a719-00498983c990" (UID: "6fcfecfe-b0e6-47a9-a719-00498983c990"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.875173 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ws2f\" (UniqueName: \"kubernetes.io/projected/145964bf-f38c-4a86-8939-6934e9fd3d1b-kube-api-access-4ws2f\") pod \"145964bf-f38c-4a86-8939-6934e9fd3d1b\" (UID: \"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.876029 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-config-data\") pod \"6fcfecfe-b0e6-47a9-a719-00498983c990\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.876487 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145964bf-f38c-4a86-8939-6934e9fd3d1b-combined-ca-bundle\") pod \"145964bf-f38c-4a86-8939-6934e9fd3d1b\" (UID: \"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.876541 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-nova-metadata-tls-certs\") pod \"6fcfecfe-b0e6-47a9-a719-00498983c990\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.876573 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-combined-ca-bundle\") pod \"6fcfecfe-b0e6-47a9-a719-00498983c990\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.876841 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/145964bf-f38c-4a86-8939-6934e9fd3d1b-config-data\") pod \"145964bf-f38c-4a86-8939-6934e9fd3d1b\" (UID: \"145964bf-f38c-4a86-8939-6934e9fd3d1b\") " Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.876874 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlv78\" (UniqueName: \"kubernetes.io/projected/6fcfecfe-b0e6-47a9-a719-00498983c990-kube-api-access-qlv78\") pod \"6fcfecfe-b0e6-47a9-a719-00498983c990\" (UID: \"6fcfecfe-b0e6-47a9-a719-00498983c990\") " Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.877701 4964 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fcfecfe-b0e6-47a9-a719-00498983c990-logs\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.877728 4964 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/145964bf-f38c-4a86-8939-6934e9fd3d1b-logs\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.879874 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/145964bf-f38c-4a86-8939-6934e9fd3d1b-kube-api-access-4ws2f" (OuterVolumeSpecName: "kube-api-access-4ws2f") pod "145964bf-f38c-4a86-8939-6934e9fd3d1b" (UID: "145964bf-f38c-4a86-8939-6934e9fd3d1b"). InnerVolumeSpecName "kube-api-access-4ws2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.880839 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fcfecfe-b0e6-47a9-a719-00498983c990-kube-api-access-qlv78" (OuterVolumeSpecName: "kube-api-access-qlv78") pod "6fcfecfe-b0e6-47a9-a719-00498983c990" (UID: "6fcfecfe-b0e6-47a9-a719-00498983c990"). InnerVolumeSpecName "kube-api-access-qlv78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.896738 4964 generic.go:334] "Generic (PLEG): container finished" podID="145964bf-f38c-4a86-8939-6934e9fd3d1b" containerID="4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6" exitCode=0 Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.896772 4964 generic.go:334] "Generic (PLEG): container finished" podID="145964bf-f38c-4a86-8939-6934e9fd3d1b" containerID="c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f" exitCode=143 Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.896817 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"145964bf-f38c-4a86-8939-6934e9fd3d1b","Type":"ContainerDied","Data":"4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6"} Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.896852 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"145964bf-f38c-4a86-8939-6934e9fd3d1b","Type":"ContainerDied","Data":"c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f"} Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.896864 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"145964bf-f38c-4a86-8939-6934e9fd3d1b","Type":"ContainerDied","Data":"36f20e83f77dcb4f7bd7cd264dbaf0de2f98d544bdf30e3ceb4731da054ac975"} Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.896882 4964 scope.go:117] "RemoveContainer" containerID="4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.896884 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.901878 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zb6n7" event={"ID":"edc8e261-b823-4e47-8434-69659d723885","Type":"ContainerDied","Data":"59256a5455cc7806dd030025a286cc3b1245ac99008096520df3d938abaf8c66"} Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.902120 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59256a5455cc7806dd030025a286cc3b1245ac99008096520df3d938abaf8c66" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.902319 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zb6n7" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.905484 4964 generic.go:334] "Generic (PLEG): container finished" podID="6fcfecfe-b0e6-47a9-a719-00498983c990" containerID="4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f" exitCode=0 Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.905510 4964 generic.go:334] "Generic (PLEG): container finished" podID="6fcfecfe-b0e6-47a9-a719-00498983c990" containerID="531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731" exitCode=143 Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.905530 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6fcfecfe-b0e6-47a9-a719-00498983c990","Type":"ContainerDied","Data":"4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f"} Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.905553 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6fcfecfe-b0e6-47a9-a719-00498983c990","Type":"ContainerDied","Data":"531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731"} Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.905563 4964 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-metadata-0" event={"ID":"6fcfecfe-b0e6-47a9-a719-00498983c990","Type":"ContainerDied","Data":"e31b149f7c27ca0767a1f82d332c38befb3060351d9a9c4a75780b0d9a0b951c"} Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.905532 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.911179 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145964bf-f38c-4a86-8939-6934e9fd3d1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "145964bf-f38c-4a86-8939-6934e9fd3d1b" (UID: "145964bf-f38c-4a86-8939-6934e9fd3d1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.926549 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fcfecfe-b0e6-47a9-a719-00498983c990" (UID: "6fcfecfe-b0e6-47a9-a719-00498983c990"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.926850 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-config-data" (OuterVolumeSpecName: "config-data") pod "6fcfecfe-b0e6-47a9-a719-00498983c990" (UID: "6fcfecfe-b0e6-47a9-a719-00498983c990"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.944678 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145964bf-f38c-4a86-8939-6934e9fd3d1b-config-data" (OuterVolumeSpecName: "config-data") pod "145964bf-f38c-4a86-8939-6934e9fd3d1b" (UID: "145964bf-f38c-4a86-8939-6934e9fd3d1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.962507 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6fcfecfe-b0e6-47a9-a719-00498983c990" (UID: "6fcfecfe-b0e6-47a9-a719-00498983c990"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.969311 4964 scope.go:117] "RemoveContainer" containerID="c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.976280 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 04 02:58:31 crc kubenswrapper[4964]: E1004 02:58:31.976656 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145964bf-f38c-4a86-8939-6934e9fd3d1b" containerName="nova-api-api" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.976674 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="145964bf-f38c-4a86-8939-6934e9fd3d1b" containerName="nova-api-api" Oct 04 02:58:31 crc kubenswrapper[4964]: E1004 02:58:31.976692 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc8e261-b823-4e47-8434-69659d723885" containerName="nova-cell1-conductor-db-sync" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.976698 4964 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="edc8e261-b823-4e47-8434-69659d723885" containerName="nova-cell1-conductor-db-sync" Oct 04 02:58:31 crc kubenswrapper[4964]: E1004 02:58:31.976707 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2b94b6-a8c2-43d3-b5b6-10e02544fe47" containerName="nova-manage" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.976713 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2b94b6-a8c2-43d3-b5b6-10e02544fe47" containerName="nova-manage" Oct 04 02:58:31 crc kubenswrapper[4964]: E1004 02:58:31.976721 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fcfecfe-b0e6-47a9-a719-00498983c990" containerName="nova-metadata-metadata" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.976726 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fcfecfe-b0e6-47a9-a719-00498983c990" containerName="nova-metadata-metadata" Oct 04 02:58:31 crc kubenswrapper[4964]: E1004 02:58:31.976746 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145964bf-f38c-4a86-8939-6934e9fd3d1b" containerName="nova-api-log" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.976752 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="145964bf-f38c-4a86-8939-6934e9fd3d1b" containerName="nova-api-log" Oct 04 02:58:31 crc kubenswrapper[4964]: E1004 02:58:31.976771 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fcfecfe-b0e6-47a9-a719-00498983c990" containerName="nova-metadata-log" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.976778 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fcfecfe-b0e6-47a9-a719-00498983c990" containerName="nova-metadata-log" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.976921 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc8e261-b823-4e47-8434-69659d723885" containerName="nova-cell1-conductor-db-sync" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.976938 4964 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="145964bf-f38c-4a86-8939-6934e9fd3d1b" containerName="nova-api-api" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.976947 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="145964bf-f38c-4a86-8939-6934e9fd3d1b" containerName="nova-api-log" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.976956 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fcfecfe-b0e6-47a9-a719-00498983c990" containerName="nova-metadata-metadata" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.976964 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fcfecfe-b0e6-47a9-a719-00498983c990" containerName="nova-metadata-log" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.976973 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b2b94b6-a8c2-43d3-b5b6-10e02544fe47" containerName="nova-manage" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.977523 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.978947 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ws2f\" (UniqueName: \"kubernetes.io/projected/145964bf-f38c-4a86-8939-6934e9fd3d1b-kube-api-access-4ws2f\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.978976 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.978985 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145964bf-f38c-4a86-8939-6934e9fd3d1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.978993 4964 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.979002 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fcfecfe-b0e6-47a9-a719-00498983c990-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.979009 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/145964bf-f38c-4a86-8939-6934e9fd3d1b-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:31 crc kubenswrapper[4964]: I1004 02:58:31.979017 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlv78\" (UniqueName: \"kubernetes.io/projected/6fcfecfe-b0e6-47a9-a719-00498983c990-kube-api-access-qlv78\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:31 crc 
kubenswrapper[4964]: I1004 02:58:31.979732 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.000954 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.031395 4964 scope.go:117] "RemoveContainer" containerID="4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6" Oct 04 02:58:32 crc kubenswrapper[4964]: E1004 02:58:32.032819 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6\": container with ID starting with 4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6 not found: ID does not exist" containerID="4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.032872 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6"} err="failed to get container status \"4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6\": rpc error: code = NotFound desc = could not find container \"4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6\": container with ID starting with 4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6 not found: ID does not exist" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.032902 4964 scope.go:117] "RemoveContainer" containerID="c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f" Oct 04 02:58:32 crc kubenswrapper[4964]: E1004 02:58:32.043054 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f\": container 
with ID starting with c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f not found: ID does not exist" containerID="c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.043114 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f"} err="failed to get container status \"c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f\": rpc error: code = NotFound desc = could not find container \"c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f\": container with ID starting with c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f not found: ID does not exist" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.043143 4964 scope.go:117] "RemoveContainer" containerID="4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.043478 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6"} err="failed to get container status \"4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6\": rpc error: code = NotFound desc = could not find container \"4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6\": container with ID starting with 4219957c94b43799e07825b7294f67c873c14cd788a1a8446ee840da343dd5e6 not found: ID does not exist" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.043496 4964 scope.go:117] "RemoveContainer" containerID="c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.044856 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f"} err="failed to get container 
status \"c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f\": rpc error: code = NotFound desc = could not find container \"c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f\": container with ID starting with c4dfde3a3a34a6005eb6377577eaceb81f731320b8ffc008b55e7755074bd88f not found: ID does not exist" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.044896 4964 scope.go:117] "RemoveContainer" containerID="4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.076910 4964 scope.go:117] "RemoveContainer" containerID="531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.081574 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b760ccf-f6fc-4396-8504-38e08c6d1737-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1b760ccf-f6fc-4396-8504-38e08c6d1737\") " pod="openstack/nova-cell1-conductor-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.081651 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b760ccf-f6fc-4396-8504-38e08c6d1737-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1b760ccf-f6fc-4396-8504-38e08c6d1737\") " pod="openstack/nova-cell1-conductor-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.081822 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk7b7\" (UniqueName: \"kubernetes.io/projected/1b760ccf-f6fc-4396-8504-38e08c6d1737-kube-api-access-jk7b7\") pod \"nova-cell1-conductor-0\" (UID: \"1b760ccf-f6fc-4396-8504-38e08c6d1737\") " pod="openstack/nova-cell1-conductor-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.110868 4964 scope.go:117] "RemoveContainer" 
containerID="4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f" Oct 04 02:58:32 crc kubenswrapper[4964]: E1004 02:58:32.111382 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f\": container with ID starting with 4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f not found: ID does not exist" containerID="4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.111425 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f"} err="failed to get container status \"4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f\": rpc error: code = NotFound desc = could not find container \"4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f\": container with ID starting with 4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f not found: ID does not exist" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.111443 4964 scope.go:117] "RemoveContainer" containerID="531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731" Oct 04 02:58:32 crc kubenswrapper[4964]: E1004 02:58:32.111958 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731\": container with ID starting with 531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731 not found: ID does not exist" containerID="531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.111982 4964 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731"} err="failed to get container status \"531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731\": rpc error: code = NotFound desc = could not find container \"531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731\": container with ID starting with 531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731 not found: ID does not exist" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.111996 4964 scope.go:117] "RemoveContainer" containerID="4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.112215 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f"} err="failed to get container status \"4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f\": rpc error: code = NotFound desc = could not find container \"4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f\": container with ID starting with 4f252fbb23a33c73e6d958a125ec46515d5619b7f4326610b1fcfe05779f828f not found: ID does not exist" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.112232 4964 scope.go:117] "RemoveContainer" containerID="531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.112458 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731"} err="failed to get container status \"531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731\": rpc error: code = NotFound desc = could not find container \"531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731\": container with ID starting with 531f28e47a1984abe0eed4310c42fbe649da94738ed5d8eae9705f2ef00fe731 not found: ID does not 
exist" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.183327 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk7b7\" (UniqueName: \"kubernetes.io/projected/1b760ccf-f6fc-4396-8504-38e08c6d1737-kube-api-access-jk7b7\") pod \"nova-cell1-conductor-0\" (UID: \"1b760ccf-f6fc-4396-8504-38e08c6d1737\") " pod="openstack/nova-cell1-conductor-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.183654 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b760ccf-f6fc-4396-8504-38e08c6d1737-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1b760ccf-f6fc-4396-8504-38e08c6d1737\") " pod="openstack/nova-cell1-conductor-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.183715 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b760ccf-f6fc-4396-8504-38e08c6d1737-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1b760ccf-f6fc-4396-8504-38e08c6d1737\") " pod="openstack/nova-cell1-conductor-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.187372 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b760ccf-f6fc-4396-8504-38e08c6d1737-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1b760ccf-f6fc-4396-8504-38e08c6d1737\") " pod="openstack/nova-cell1-conductor-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.187533 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b760ccf-f6fc-4396-8504-38e08c6d1737-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1b760ccf-f6fc-4396-8504-38e08c6d1737\") " pod="openstack/nova-cell1-conductor-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.206905 4964 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jk7b7\" (UniqueName: \"kubernetes.io/projected/1b760ccf-f6fc-4396-8504-38e08c6d1737-kube-api-access-jk7b7\") pod \"nova-cell1-conductor-0\" (UID: \"1b760ccf-f6fc-4396-8504-38e08c6d1737\") " pod="openstack/nova-cell1-conductor-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.234561 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.249130 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.257722 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.264215 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.272167 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.282820 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 04 02:58:32 crc kubenswrapper[4964]: E1004 02:58:32.283715 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="697e704a-98d7-40a5-b685-d46635ebe33d" containerName="dnsmasq-dns" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.283738 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="697e704a-98d7-40a5-b685-d46635ebe33d" containerName="dnsmasq-dns" Oct 04 02:58:32 crc kubenswrapper[4964]: E1004 02:58:32.283766 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="697e704a-98d7-40a5-b685-d46635ebe33d" containerName="init" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.283775 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="697e704a-98d7-40a5-b685-d46635ebe33d" containerName="init" Oct 04 02:58:32 crc kubenswrapper[4964]: 
I1004 02:58:32.284086 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="697e704a-98d7-40a5-b685-d46635ebe33d" containerName="dnsmasq-dns" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.285479 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.285743 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-config\") pod \"697e704a-98d7-40a5-b685-d46635ebe33d\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.285845 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-ovsdbserver-nb\") pod \"697e704a-98d7-40a5-b685-d46635ebe33d\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.285884 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-dns-svc\") pod \"697e704a-98d7-40a5-b685-d46635ebe33d\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.285969 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbrdt\" (UniqueName: \"kubernetes.io/projected/697e704a-98d7-40a5-b685-d46635ebe33d-kube-api-access-jbrdt\") pod \"697e704a-98d7-40a5-b685-d46635ebe33d\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.286106 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-ovsdbserver-sb\") pod 
\"697e704a-98d7-40a5-b685-d46635ebe33d\" (UID: \"697e704a-98d7-40a5-b685-d46635ebe33d\") " Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.289280 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.291914 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.299906 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/697e704a-98d7-40a5-b685-d46635ebe33d-kube-api-access-jbrdt" (OuterVolumeSpecName: "kube-api-access-jbrdt") pod "697e704a-98d7-40a5-b685-d46635ebe33d" (UID: "697e704a-98d7-40a5-b685-d46635ebe33d"). InnerVolumeSpecName "kube-api-access-jbrdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.306895 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.322045 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.328534 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.328939 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.336114 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.340284 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.386116 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "697e704a-98d7-40a5-b685-d46635ebe33d" (UID: "697e704a-98d7-40a5-b685-d46635ebe33d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.387938 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-logs\") pod \"nova-api-0\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " pod="openstack/nova-api-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.388018 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-logs\") pod \"nova-metadata-0\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.388043 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.388064 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " pod="openstack/nova-api-0" Oct 
04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.388090 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcx8j\" (UniqueName: \"kubernetes.io/projected/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-kube-api-access-hcx8j\") pod \"nova-api-0\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " pod="openstack/nova-api-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.388119 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjkqn\" (UniqueName: \"kubernetes.io/projected/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-kube-api-access-bjkqn\") pod \"nova-metadata-0\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.388136 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.388154 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-config-data\") pod \"nova-api-0\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " pod="openstack/nova-api-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.388277 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-config-data\") pod \"nova-metadata-0\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.388361 4964 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-jbrdt\" (UniqueName: \"kubernetes.io/projected/697e704a-98d7-40a5-b685-d46635ebe33d-kube-api-access-jbrdt\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.388371 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.394734 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "697e704a-98d7-40a5-b685-d46635ebe33d" (UID: "697e704a-98d7-40a5-b685-d46635ebe33d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.403122 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-config" (OuterVolumeSpecName: "config") pod "697e704a-98d7-40a5-b685-d46635ebe33d" (UID: "697e704a-98d7-40a5-b685-d46635ebe33d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.488650 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "697e704a-98d7-40a5-b685-d46635ebe33d" (UID: "697e704a-98d7-40a5-b685-d46635ebe33d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.489976 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-config-data\") pod \"nova-metadata-0\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.490060 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-logs\") pod \"nova-api-0\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " pod="openstack/nova-api-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.490113 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-logs\") pod \"nova-metadata-0\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.490136 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.490153 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " pod="openstack/nova-api-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.490171 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcx8j\" 
(UniqueName: \"kubernetes.io/projected/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-kube-api-access-hcx8j\") pod \"nova-api-0\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " pod="openstack/nova-api-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.490189 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjkqn\" (UniqueName: \"kubernetes.io/projected/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-kube-api-access-bjkqn\") pod \"nova-metadata-0\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.490204 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.490220 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-config-data\") pod \"nova-api-0\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " pod="openstack/nova-api-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.490285 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.490298 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.490308 4964 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/697e704a-98d7-40a5-b685-d46635ebe33d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.495258 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.495987 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-config-data\") pod \"nova-api-0\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " pod="openstack/nova-api-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.497013 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-logs\") pod \"nova-api-0\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " pod="openstack/nova-api-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.497235 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-logs\") pod \"nova-metadata-0\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.498282 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-config-data\") pod \"nova-metadata-0\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.498371 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " pod="openstack/nova-api-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.498943 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.509057 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcx8j\" (UniqueName: \"kubernetes.io/projected/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-kube-api-access-hcx8j\") pod \"nova-api-0\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " pod="openstack/nova-api-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.511653 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjkqn\" (UniqueName: \"kubernetes.io/projected/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-kube-api-access-bjkqn\") pod \"nova-metadata-0\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.540060 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.547492 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.786953 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.859727 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="145964bf-f38c-4a86-8939-6934e9fd3d1b" path="/var/lib/kubelet/pods/145964bf-f38c-4a86-8939-6934e9fd3d1b/volumes" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.860599 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fcfecfe-b0e6-47a9-a719-00498983c990" path="/var/lib/kubelet/pods/6fcfecfe-b0e6-47a9-a719-00498983c990/volumes" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.896803 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f3c888-5938-4862-9f0c-3b7b86566f84-config-data\") pod \"c8f3c888-5938-4862-9f0c-3b7b86566f84\" (UID: \"c8f3c888-5938-4862-9f0c-3b7b86566f84\") " Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.896969 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f3c888-5938-4862-9f0c-3b7b86566f84-combined-ca-bundle\") pod \"c8f3c888-5938-4862-9f0c-3b7b86566f84\" (UID: \"c8f3c888-5938-4862-9f0c-3b7b86566f84\") " Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.897059 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6zr6\" (UniqueName: \"kubernetes.io/projected/c8f3c888-5938-4862-9f0c-3b7b86566f84-kube-api-access-g6zr6\") pod \"c8f3c888-5938-4862-9f0c-3b7b86566f84\" (UID: \"c8f3c888-5938-4862-9f0c-3b7b86566f84\") " Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.901713 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8f3c888-5938-4862-9f0c-3b7b86566f84-kube-api-access-g6zr6" (OuterVolumeSpecName: "kube-api-access-g6zr6") pod "c8f3c888-5938-4862-9f0c-3b7b86566f84" (UID: "c8f3c888-5938-4862-9f0c-3b7b86566f84"). 
InnerVolumeSpecName "kube-api-access-g6zr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.922964 4964 generic.go:334] "Generic (PLEG): container finished" podID="c8f3c888-5938-4862-9f0c-3b7b86566f84" containerID="c9c13c261560dd78a5b34d75c884f272d7a55d96651af2b9d544463d41928ee2" exitCode=0 Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.923068 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c8f3c888-5938-4862-9f0c-3b7b86566f84","Type":"ContainerDied","Data":"c9c13c261560dd78a5b34d75c884f272d7a55d96651af2b9d544463d41928ee2"} Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.923117 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c8f3c888-5938-4862-9f0c-3b7b86566f84","Type":"ContainerDied","Data":"dc0a4c31f27f1b58f3ac5d943b277c41c3c4cd5e768803fc606c334f0ab3f833"} Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.923141 4964 scope.go:117] "RemoveContainer" containerID="c9c13c261560dd78a5b34d75c884f272d7a55d96651af2b9d544463d41928ee2" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.923347 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.933208 4964 generic.go:334] "Generic (PLEG): container finished" podID="697e704a-98d7-40a5-b685-d46635ebe33d" containerID="cf49bbb7d96dac475f7539f5096c440a0502a63733f0e17d73401e5cc66c5ee3" exitCode=0 Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.933291 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" event={"ID":"697e704a-98d7-40a5-b685-d46635ebe33d","Type":"ContainerDied","Data":"cf49bbb7d96dac475f7539f5096c440a0502a63733f0e17d73401e5cc66c5ee3"} Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.933319 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" event={"ID":"697e704a-98d7-40a5-b685-d46635ebe33d","Type":"ContainerDied","Data":"a9c4e4af205c3af8ba3bd41b558c1cee566634756fd910b5b3f718eabf547325"} Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.933316 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-2rr95" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.939373 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8f3c888-5938-4862-9f0c-3b7b86566f84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8f3c888-5938-4862-9f0c-3b7b86566f84" (UID: "c8f3c888-5938-4862-9f0c-3b7b86566f84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.940114 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8f3c888-5938-4862-9f0c-3b7b86566f84-config-data" (OuterVolumeSpecName: "config-data") pod "c8f3c888-5938-4862-9f0c-3b7b86566f84" (UID: "c8f3c888-5938-4862-9f0c-3b7b86566f84"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.961202 4964 scope.go:117] "RemoveContainer" containerID="c9c13c261560dd78a5b34d75c884f272d7a55d96651af2b9d544463d41928ee2" Oct 04 02:58:32 crc kubenswrapper[4964]: E1004 02:58:32.961579 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9c13c261560dd78a5b34d75c884f272d7a55d96651af2b9d544463d41928ee2\": container with ID starting with c9c13c261560dd78a5b34d75c884f272d7a55d96651af2b9d544463d41928ee2 not found: ID does not exist" containerID="c9c13c261560dd78a5b34d75c884f272d7a55d96651af2b9d544463d41928ee2" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.961666 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c13c261560dd78a5b34d75c884f272d7a55d96651af2b9d544463d41928ee2"} err="failed to get container status \"c9c13c261560dd78a5b34d75c884f272d7a55d96651af2b9d544463d41928ee2\": rpc error: code = NotFound desc = could not find container \"c9c13c261560dd78a5b34d75c884f272d7a55d96651af2b9d544463d41928ee2\": container with ID starting with c9c13c261560dd78a5b34d75c884f272d7a55d96651af2b9d544463d41928ee2 not found: ID does not exist" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.961693 4964 scope.go:117] "RemoveContainer" containerID="cf49bbb7d96dac475f7539f5096c440a0502a63733f0e17d73401e5cc66c5ee3" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.967425 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-2rr95"] Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.982835 4964 scope.go:117] "RemoveContainer" containerID="e07ffb6d953f869c63d66b38bc1b8dd4160d67531d04cab9e12a04cadc9012ea" Oct 04 02:58:32 crc kubenswrapper[4964]: I1004 02:58:32.985023 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-2rr95"] Oct 04 02:58:32 crc 
kubenswrapper[4964]: I1004 02:58:32.995997 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.003469 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f3c888-5938-4862-9f0c-3b7b86566f84-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.006913 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f3c888-5938-4862-9f0c-3b7b86566f84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.007078 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6zr6\" (UniqueName: \"kubernetes.io/projected/c8f3c888-5938-4862-9f0c-3b7b86566f84-kube-api-access-g6zr6\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.006834 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.010888 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.013895 4964 scope.go:117] "RemoveContainer" containerID="cf49bbb7d96dac475f7539f5096c440a0502a63733f0e17d73401e5cc66c5ee3" Oct 04 02:58:33 crc kubenswrapper[4964]: E1004 02:58:33.014433 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf49bbb7d96dac475f7539f5096c440a0502a63733f0e17d73401e5cc66c5ee3\": container with ID starting with cf49bbb7d96dac475f7539f5096c440a0502a63733f0e17d73401e5cc66c5ee3 not found: ID does not exist" containerID="cf49bbb7d96dac475f7539f5096c440a0502a63733f0e17d73401e5cc66c5ee3" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.014533 4964 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf49bbb7d96dac475f7539f5096c440a0502a63733f0e17d73401e5cc66c5ee3"} err="failed to get container status \"cf49bbb7d96dac475f7539f5096c440a0502a63733f0e17d73401e5cc66c5ee3\": rpc error: code = NotFound desc = could not find container \"cf49bbb7d96dac475f7539f5096c440a0502a63733f0e17d73401e5cc66c5ee3\": container with ID starting with cf49bbb7d96dac475f7539f5096c440a0502a63733f0e17d73401e5cc66c5ee3 not found: ID does not exist" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.014592 4964 scope.go:117] "RemoveContainer" containerID="e07ffb6d953f869c63d66b38bc1b8dd4160d67531d04cab9e12a04cadc9012ea" Oct 04 02:58:33 crc kubenswrapper[4964]: E1004 02:58:33.015071 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e07ffb6d953f869c63d66b38bc1b8dd4160d67531d04cab9e12a04cadc9012ea\": container with ID starting with e07ffb6d953f869c63d66b38bc1b8dd4160d67531d04cab9e12a04cadc9012ea not found: ID does not exist" containerID="e07ffb6d953f869c63d66b38bc1b8dd4160d67531d04cab9e12a04cadc9012ea" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.015097 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07ffb6d953f869c63d66b38bc1b8dd4160d67531d04cab9e12a04cadc9012ea"} err="failed to get container status \"e07ffb6d953f869c63d66b38bc1b8dd4160d67531d04cab9e12a04cadc9012ea\": rpc error: code = NotFound desc = could not find container \"e07ffb6d953f869c63d66b38bc1b8dd4160d67531d04cab9e12a04cadc9012ea\": container with ID starting with e07ffb6d953f869c63d66b38bc1b8dd4160d67531d04cab9e12a04cadc9012ea not found: ID does not exist" Oct 04 02:58:33 crc kubenswrapper[4964]: W1004 02:58:33.032517 4964 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b760ccf_f6fc_4396_8504_38e08c6d1737.slice/crio-240379a666066105de81f2a7fee09fd9b0e6f9804600754cd20096d3c2823054 WatchSource:0}: Error finding container 240379a666066105de81f2a7fee09fd9b0e6f9804600754cd20096d3c2823054: Status 404 returned error can't find the container with id 240379a666066105de81f2a7fee09fd9b0e6f9804600754cd20096d3c2823054 Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.386706 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.396887 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.406391 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 02:58:33 crc kubenswrapper[4964]: E1004 02:58:33.406806 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8f3c888-5938-4862-9f0c-3b7b86566f84" containerName="nova-scheduler-scheduler" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.406825 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8f3c888-5938-4862-9f0c-3b7b86566f84" containerName="nova-scheduler-scheduler" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.406993 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8f3c888-5938-4862-9f0c-3b7b86566f84" containerName="nova-scheduler-scheduler" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.407570 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.409989 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.427643 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.514849 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec06312-bef7-494b-95d0-0f15ee8860f8-config-data\") pod \"nova-scheduler-0\" (UID: \"dec06312-bef7-494b-95d0-0f15ee8860f8\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.514922 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlvxv\" (UniqueName: \"kubernetes.io/projected/dec06312-bef7-494b-95d0-0f15ee8860f8-kube-api-access-dlvxv\") pod \"nova-scheduler-0\" (UID: \"dec06312-bef7-494b-95d0-0f15ee8860f8\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.515037 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec06312-bef7-494b-95d0-0f15ee8860f8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dec06312-bef7-494b-95d0-0f15ee8860f8\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.616636 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec06312-bef7-494b-95d0-0f15ee8860f8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dec06312-bef7-494b-95d0-0f15ee8860f8\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.616749 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec06312-bef7-494b-95d0-0f15ee8860f8-config-data\") pod \"nova-scheduler-0\" (UID: \"dec06312-bef7-494b-95d0-0f15ee8860f8\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.616793 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlvxv\" (UniqueName: \"kubernetes.io/projected/dec06312-bef7-494b-95d0-0f15ee8860f8-kube-api-access-dlvxv\") pod \"nova-scheduler-0\" (UID: \"dec06312-bef7-494b-95d0-0f15ee8860f8\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.622360 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec06312-bef7-494b-95d0-0f15ee8860f8-config-data\") pod \"nova-scheduler-0\" (UID: \"dec06312-bef7-494b-95d0-0f15ee8860f8\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.631415 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec06312-bef7-494b-95d0-0f15ee8860f8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dec06312-bef7-494b-95d0-0f15ee8860f8\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.634114 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlvxv\" (UniqueName: \"kubernetes.io/projected/dec06312-bef7-494b-95d0-0f15ee8860f8-kube-api-access-dlvxv\") pod \"nova-scheduler-0\" (UID: \"dec06312-bef7-494b-95d0-0f15ee8860f8\") " pod="openstack/nova-scheduler-0" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.739165 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.920071 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.965448 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1b760ccf-f6fc-4396-8504-38e08c6d1737","Type":"ContainerStarted","Data":"949f83a961d9709e22e1a92f52e47bf1faecd788fa008e38d57c33a967456f4c"} Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.965486 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1b760ccf-f6fc-4396-8504-38e08c6d1737","Type":"ContainerStarted","Data":"240379a666066105de81f2a7fee09fd9b0e6f9804600754cd20096d3c2823054"} Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.966097 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.971453 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e6c36db3-d1d2-4130-b63f-6c94281b8bc8","Type":"ContainerStarted","Data":"e3520055b6b1c2ddfef121f6b5c7d263ea10bfdefd3b5dead486de907ce91361"} Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.971480 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e6c36db3-d1d2-4130-b63f-6c94281b8bc8","Type":"ContainerStarted","Data":"2bb996504a950d4b69f6f0e7d9c48cf29ad063462a3b1e5970068f87e0012628"} Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.971491 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e6c36db3-d1d2-4130-b63f-6c94281b8bc8","Type":"ContainerStarted","Data":"4cccc788089520ca65f5ddfccb0531141ea5df7be76e03446fd4ee0d1300c375"} Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.973148 4964 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa","Type":"ContainerStarted","Data":"d2fca9dc482f392dbf2fed1cca5375d8faf65e3251036650aa6198ccbc252b3b"} Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.973172 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa","Type":"ContainerStarted","Data":"8a7afcf85651f3174fcbe431a4f31d4dcc2f0a64c80a6ac95fd5e74c9b0eaab1"} Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.973180 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa","Type":"ContainerStarted","Data":"cf2a48cf5e1b7935a7fb55c2384162c67fb730da5a8d3344021d1cc5222f523d"} Oct 04 02:58:33 crc kubenswrapper[4964]: I1004 02:58:33.984740 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.984720213 podStartE2EDuration="2.984720213s" podCreationTimestamp="2025-10-04 02:58:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:58:33.982502225 +0000 UTC m=+1093.879460863" watchObservedRunningTime="2025-10-04 02:58:33.984720213 +0000 UTC m=+1093.881678851" Oct 04 02:58:34 crc kubenswrapper[4964]: I1004 02:58:34.006085 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.006067854 podStartE2EDuration="2.006067854s" podCreationTimestamp="2025-10-04 02:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:58:33.998530302 +0000 UTC m=+1093.895488940" watchObservedRunningTime="2025-10-04 02:58:34.006067854 +0000 UTC m=+1093.903026492" Oct 04 02:58:34 crc kubenswrapper[4964]: I1004 02:58:34.017789 4964 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.017770317 podStartE2EDuration="2.017770317s" podCreationTimestamp="2025-10-04 02:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:58:34.015310641 +0000 UTC m=+1093.912269299" watchObservedRunningTime="2025-10-04 02:58:34.017770317 +0000 UTC m=+1093.914728955" Oct 04 02:58:34 crc kubenswrapper[4964]: I1004 02:58:34.163276 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 02:58:34 crc kubenswrapper[4964]: W1004 02:58:34.166653 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddec06312_bef7_494b_95d0_0f15ee8860f8.slice/crio-422580751fdc584f293c17dfe33473be8c0b9442b685360345499caeb0920343 WatchSource:0}: Error finding container 422580751fdc584f293c17dfe33473be8c0b9442b685360345499caeb0920343: Status 404 returned error can't find the container with id 422580751fdc584f293c17dfe33473be8c0b9442b685360345499caeb0920343 Oct 04 02:58:34 crc kubenswrapper[4964]: I1004 02:58:34.449149 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:58:34 crc kubenswrapper[4964]: I1004 02:58:34.449691 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:58:34 crc kubenswrapper[4964]: I1004 02:58:34.854740 4964 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="697e704a-98d7-40a5-b685-d46635ebe33d" path="/var/lib/kubelet/pods/697e704a-98d7-40a5-b685-d46635ebe33d/volumes" Oct 04 02:58:34 crc kubenswrapper[4964]: I1004 02:58:34.855555 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8f3c888-5938-4862-9f0c-3b7b86566f84" path="/var/lib/kubelet/pods/c8f3c888-5938-4862-9f0c-3b7b86566f84/volumes" Oct 04 02:58:34 crc kubenswrapper[4964]: I1004 02:58:34.984280 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dec06312-bef7-494b-95d0-0f15ee8860f8","Type":"ContainerStarted","Data":"394a806a25f046df7a516c452037f8349f85126756ea0422e416f177dcf4e917"} Oct 04 02:58:34 crc kubenswrapper[4964]: I1004 02:58:34.984331 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dec06312-bef7-494b-95d0-0f15ee8860f8","Type":"ContainerStarted","Data":"422580751fdc584f293c17dfe33473be8c0b9442b685360345499caeb0920343"} Oct 04 02:58:35 crc kubenswrapper[4964]: I1004 02:58:35.018077 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.018039025 podStartE2EDuration="2.018039025s" podCreationTimestamp="2025-10-04 02:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:58:35.004639637 +0000 UTC m=+1094.901598275" watchObservedRunningTime="2025-10-04 02:58:35.018039025 +0000 UTC m=+1094.914997703" Oct 04 02:58:37 crc kubenswrapper[4964]: I1004 02:58:37.548434 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 04 02:58:37 crc kubenswrapper[4964]: I1004 02:58:37.548836 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 04 02:58:38 crc kubenswrapper[4964]: I1004 02:58:38.739689 4964 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 04 02:58:42 crc kubenswrapper[4964]: I1004 02:58:42.390734 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 04 02:58:42 crc kubenswrapper[4964]: I1004 02:58:42.541357 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 04 02:58:42 crc kubenswrapper[4964]: I1004 02:58:42.541747 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 04 02:58:42 crc kubenswrapper[4964]: I1004 02:58:42.547954 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 04 02:58:42 crc kubenswrapper[4964]: I1004 02:58:42.548026 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 04 02:58:43 crc kubenswrapper[4964]: I1004 02:58:43.639769 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e6c36db3-d1d2-4130-b63f-6c94281b8bc8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 04 02:58:43 crc kubenswrapper[4964]: I1004 02:58:43.639838 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.181:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 02:58:43 crc kubenswrapper[4964]: I1004 02:58:43.639950 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e6c36db3-d1d2-4130-b63f-6c94281b8bc8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Oct 04 02:58:43 crc kubenswrapper[4964]: I1004 02:58:43.639996 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.181:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 02:58:43 crc kubenswrapper[4964]: I1004 02:58:43.740296 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 04 02:58:43 crc kubenswrapper[4964]: I1004 02:58:43.786369 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 04 02:58:44 crc kubenswrapper[4964]: I1004 02:58:44.123376 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 04 02:58:52 crc kubenswrapper[4964]: I1004 02:58:52.549644 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 04 02:58:52 crc kubenswrapper[4964]: I1004 02:58:52.550868 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 04 02:58:52 crc kubenswrapper[4964]: I1004 02:58:52.554056 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 04 02:58:52 crc kubenswrapper[4964]: I1004 02:58:52.559334 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 04 02:58:52 crc kubenswrapper[4964]: I1004 02:58:52.560015 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 04 02:58:52 crc kubenswrapper[4964]: I1004 02:58:52.571787 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 04 02:58:52 crc kubenswrapper[4964]: I1004 02:58:52.578645 4964 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.212090 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.217956 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.219260 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.450024 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-nwzrj"] Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.455064 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.463312 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-nwzrj"] Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.574719 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-config\") pod \"dnsmasq-dns-5b856c5697-nwzrj\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.574778 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4psv\" (UniqueName: \"kubernetes.io/projected/03b93a62-2bba-402a-9dcb-d6b501af6c4b-kube-api-access-d4psv\") pod \"dnsmasq-dns-5b856c5697-nwzrj\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.574851 4964 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-nwzrj\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.574905 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-dns-svc\") pod \"dnsmasq-dns-5b856c5697-nwzrj\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.574997 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-nwzrj\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.676295 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-nwzrj\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.676356 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-dns-svc\") pod \"dnsmasq-dns-5b856c5697-nwzrj\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.676463 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-nwzrj\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.676490 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-config\") pod \"dnsmasq-dns-5b856c5697-nwzrj\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.676536 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4psv\" (UniqueName: \"kubernetes.io/projected/03b93a62-2bba-402a-9dcb-d6b501af6c4b-kube-api-access-d4psv\") pod \"dnsmasq-dns-5b856c5697-nwzrj\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.677275 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-dns-svc\") pod \"dnsmasq-dns-5b856c5697-nwzrj\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.677471 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-nwzrj\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.677690 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-nwzrj\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.677854 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-config\") pod \"dnsmasq-dns-5b856c5697-nwzrj\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.700794 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4psv\" (UniqueName: \"kubernetes.io/projected/03b93a62-2bba-402a-9dcb-d6b501af6c4b-kube-api-access-d4psv\") pod \"dnsmasq-dns-5b856c5697-nwzrj\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:53 crc kubenswrapper[4964]: I1004 02:58:53.781290 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:54 crc kubenswrapper[4964]: I1004 02:58:54.228053 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-nwzrj"] Oct 04 02:58:55 crc kubenswrapper[4964]: I1004 02:58:55.230482 4964 generic.go:334] "Generic (PLEG): container finished" podID="03b93a62-2bba-402a-9dcb-d6b501af6c4b" containerID="e8d461c3ba423a9945400a3dfb634fe761490eb93da2e405c8b100208e551f0a" exitCode=0 Oct 04 02:58:55 crc kubenswrapper[4964]: I1004 02:58:55.230569 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" event={"ID":"03b93a62-2bba-402a-9dcb-d6b501af6c4b","Type":"ContainerDied","Data":"e8d461c3ba423a9945400a3dfb634fe761490eb93da2e405c8b100208e551f0a"} Oct 04 02:58:55 crc kubenswrapper[4964]: I1004 02:58:55.235013 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" event={"ID":"03b93a62-2bba-402a-9dcb-d6b501af6c4b","Type":"ContainerStarted","Data":"f9977feb2ae54e71a3fe1757f3a1a693636ef7562207fd919af6f9d232233efa"} Oct 04 02:58:55 crc kubenswrapper[4964]: I1004 02:58:55.529882 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:58:55 crc kubenswrapper[4964]: I1004 02:58:55.550255 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:58:55 crc kubenswrapper[4964]: I1004 02:58:55.550812 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerName="ceilometer-central-agent" containerID="cri-o://2b53a5c9caf49fdaa7d0c279a279de2852f91b876442a51f8efb1701195e3f23" gracePeriod=30 Oct 04 02:58:55 crc kubenswrapper[4964]: I1004 02:58:55.550855 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" 
containerName="proxy-httpd" containerID="cri-o://13cdba0ef215d4cde02983e0b02e0dda19fc6e4a88e302d182a150ba9ce87275" gracePeriod=30 Oct 04 02:58:55 crc kubenswrapper[4964]: I1004 02:58:55.551215 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerName="sg-core" containerID="cri-o://423ab058eab13a173490af5786bb7076721e6c3d05cde6474b2a812c54fd2424" gracePeriod=30 Oct 04 02:58:55 crc kubenswrapper[4964]: I1004 02:58:55.551357 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerName="ceilometer-notification-agent" containerID="cri-o://fe96a7a19086378fa74f9cfac1ecf10898cbe088f52ba45707fee2ebd921069c" gracePeriod=30 Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.249714 4964 generic.go:334] "Generic (PLEG): container finished" podID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerID="13cdba0ef215d4cde02983e0b02e0dda19fc6e4a88e302d182a150ba9ce87275" exitCode=0 Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.249951 4964 generic.go:334] "Generic (PLEG): container finished" podID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerID="423ab058eab13a173490af5786bb7076721e6c3d05cde6474b2a812c54fd2424" exitCode=2 Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.249960 4964 generic.go:334] "Generic (PLEG): container finished" podID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerID="2b53a5c9caf49fdaa7d0c279a279de2852f91b876442a51f8efb1701195e3f23" exitCode=0 Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.250003 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb1307c-b75c-4636-a08b-b6be0eec41cd","Type":"ContainerDied","Data":"13cdba0ef215d4cde02983e0b02e0dda19fc6e4a88e302d182a150ba9ce87275"} Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.250029 4964 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb1307c-b75c-4636-a08b-b6be0eec41cd","Type":"ContainerDied","Data":"423ab058eab13a173490af5786bb7076721e6c3d05cde6474b2a812c54fd2424"} Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.250039 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb1307c-b75c-4636-a08b-b6be0eec41cd","Type":"ContainerDied","Data":"2b53a5c9caf49fdaa7d0c279a279de2852f91b876442a51f8efb1701195e3f23"} Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.251366 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" event={"ID":"03b93a62-2bba-402a-9dcb-d6b501af6c4b","Type":"ContainerStarted","Data":"fe0f863949acc1b1b253b6ba2fcadee3cd2a5d828c83595397ba8a9a4395bef0"} Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.252531 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.254878 4964 generic.go:334] "Generic (PLEG): container finished" podID="8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5" containerID="8f0d133e400581fc931972f43c7d0764d9d640fac6970ab9e6efc9740254826f" exitCode=137 Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.255304 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e6c36db3-d1d2-4130-b63f-6c94281b8bc8" containerName="nova-api-log" containerID="cri-o://2bb996504a950d4b69f6f0e7d9c48cf29ad063462a3b1e5970068f87e0012628" gracePeriod=30 Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.255470 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5","Type":"ContainerDied","Data":"8f0d133e400581fc931972f43c7d0764d9d640fac6970ab9e6efc9740254826f"} Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.255584 4964 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5","Type":"ContainerDied","Data":"219992e9cec7463418042e4278814791a8bea32c85bffe9fb5a68179a52e341a"} Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.255706 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="219992e9cec7463418042e4278814791a8bea32c85bffe9fb5a68179a52e341a" Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.255760 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e6c36db3-d1d2-4130-b63f-6c94281b8bc8" containerName="nova-api-api" containerID="cri-o://e3520055b6b1c2ddfef121f6b5c7d263ea10bfdefd3b5dead486de907ce91361" gracePeriod=30 Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.279468 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" podStartSLOduration=3.279449433 podStartE2EDuration="3.279449433s" podCreationTimestamp="2025-10-04 02:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:58:56.275109866 +0000 UTC m=+1116.172068504" watchObservedRunningTime="2025-10-04 02:58:56.279449433 +0000 UTC m=+1116.176408061" Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.298905 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.439121 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpg4r\" (UniqueName: \"kubernetes.io/projected/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-kube-api-access-zpg4r\") pod \"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5\" (UID: \"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5\") " Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.439212 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-config-data\") pod \"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5\" (UID: \"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5\") " Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.439281 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-combined-ca-bundle\") pod \"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5\" (UID: \"8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5\") " Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.446259 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-kube-api-access-zpg4r" (OuterVolumeSpecName: "kube-api-access-zpg4r") pod "8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5" (UID: "8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5"). InnerVolumeSpecName "kube-api-access-zpg4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.469134 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5" (UID: "8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.485188 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-config-data" (OuterVolumeSpecName: "config-data") pod "8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5" (UID: "8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.541906 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpg4r\" (UniqueName: \"kubernetes.io/projected/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-kube-api-access-zpg4r\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.541944 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:56 crc kubenswrapper[4964]: I1004 02:58:56.541953 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.268233 4964 generic.go:334] "Generic (PLEG): container finished" podID="e6c36db3-d1d2-4130-b63f-6c94281b8bc8" containerID="2bb996504a950d4b69f6f0e7d9c48cf29ad063462a3b1e5970068f87e0012628" exitCode=143 Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.268286 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e6c36db3-d1d2-4130-b63f-6c94281b8bc8","Type":"ContainerDied","Data":"2bb996504a950d4b69f6f0e7d9c48cf29ad063462a3b1e5970068f87e0012628"} Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.269232 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.306382 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.325199 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.334648 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 02:58:57 crc kubenswrapper[4964]: E1004 02:58:57.335317 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5" containerName="nova-cell1-novncproxy-novncproxy" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.335346 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5" containerName="nova-cell1-novncproxy-novncproxy" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.335674 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5" containerName="nova-cell1-novncproxy-novncproxy" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.336752 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.342706 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.342856 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.343195 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.345358 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.463193 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e392b9b2-79f9-4814-8dc6-2966cae7a018-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e392b9b2-79f9-4814-8dc6-2966cae7a018\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.463291 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e392b9b2-79f9-4814-8dc6-2966cae7a018-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e392b9b2-79f9-4814-8dc6-2966cae7a018\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.463332 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e392b9b2-79f9-4814-8dc6-2966cae7a018-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e392b9b2-79f9-4814-8dc6-2966cae7a018\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc 
kubenswrapper[4964]: I1004 02:58:57.463359 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e392b9b2-79f9-4814-8dc6-2966cae7a018-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e392b9b2-79f9-4814-8dc6-2966cae7a018\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.463400 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqvrk\" (UniqueName: \"kubernetes.io/projected/e392b9b2-79f9-4814-8dc6-2966cae7a018-kube-api-access-gqvrk\") pod \"nova-cell1-novncproxy-0\" (UID: \"e392b9b2-79f9-4814-8dc6-2966cae7a018\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.565731 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e392b9b2-79f9-4814-8dc6-2966cae7a018-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e392b9b2-79f9-4814-8dc6-2966cae7a018\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.565873 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e392b9b2-79f9-4814-8dc6-2966cae7a018-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e392b9b2-79f9-4814-8dc6-2966cae7a018\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.565951 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e392b9b2-79f9-4814-8dc6-2966cae7a018-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e392b9b2-79f9-4814-8dc6-2966cae7a018\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 
02:58:57.565995 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e392b9b2-79f9-4814-8dc6-2966cae7a018-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e392b9b2-79f9-4814-8dc6-2966cae7a018\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.566054 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqvrk\" (UniqueName: \"kubernetes.io/projected/e392b9b2-79f9-4814-8dc6-2966cae7a018-kube-api-access-gqvrk\") pod \"nova-cell1-novncproxy-0\" (UID: \"e392b9b2-79f9-4814-8dc6-2966cae7a018\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.573170 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e392b9b2-79f9-4814-8dc6-2966cae7a018-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e392b9b2-79f9-4814-8dc6-2966cae7a018\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.573322 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e392b9b2-79f9-4814-8dc6-2966cae7a018-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e392b9b2-79f9-4814-8dc6-2966cae7a018\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.573261 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e392b9b2-79f9-4814-8dc6-2966cae7a018-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e392b9b2-79f9-4814-8dc6-2966cae7a018\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.573255 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e392b9b2-79f9-4814-8dc6-2966cae7a018-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e392b9b2-79f9-4814-8dc6-2966cae7a018\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.582344 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqvrk\" (UniqueName: \"kubernetes.io/projected/e392b9b2-79f9-4814-8dc6-2966cae7a018-kube-api-access-gqvrk\") pod \"nova-cell1-novncproxy-0\" (UID: \"e392b9b2-79f9-4814-8dc6-2966cae7a018\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.668707 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:58:57 crc kubenswrapper[4964]: W1004 02:58:57.949923 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode392b9b2_79f9_4814_8dc6_2966cae7a018.slice/crio-76b05ff12b896263f3a181a8a60aafd56b124ddedd5f246b99c70145b5417255 WatchSource:0}: Error finding container 76b05ff12b896263f3a181a8a60aafd56b124ddedd5f246b99c70145b5417255: Status 404 returned error can't find the container with id 76b05ff12b896263f3a181a8a60aafd56b124ddedd5f246b99c70145b5417255 Oct 04 02:58:57 crc kubenswrapper[4964]: I1004 02:58:57.958942 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 02:58:58 crc kubenswrapper[4964]: I1004 02:58:58.278495 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e392b9b2-79f9-4814-8dc6-2966cae7a018","Type":"ContainerStarted","Data":"7afb66a6ab82626bf8fcbc34b6725eb996e9508ff3225bc2a00eb7f30e4b0646"} Oct 04 02:58:58 crc kubenswrapper[4964]: I1004 02:58:58.278900 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"e392b9b2-79f9-4814-8dc6-2966cae7a018","Type":"ContainerStarted","Data":"76b05ff12b896263f3a181a8a60aafd56b124ddedd5f246b99c70145b5417255"} Oct 04 02:58:58 crc kubenswrapper[4964]: I1004 02:58:58.298941 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.298919744 podStartE2EDuration="1.298919744s" podCreationTimestamp="2025-10-04 02:58:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:58:58.294974777 +0000 UTC m=+1118.191933425" watchObservedRunningTime="2025-10-04 02:58:58.298919744 +0000 UTC m=+1118.195878392" Oct 04 02:58:58 crc kubenswrapper[4964]: I1004 02:58:58.869543 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5" path="/var/lib/kubelet/pods/8b920d92-b8c0-4c64-9d76-b0f0c4c8c4a5/volumes" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.129996 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.289266 4964 generic.go:334] "Generic (PLEG): container finished" podID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerID="fe96a7a19086378fa74f9cfac1ecf10898cbe088f52ba45707fee2ebd921069c" exitCode=0 Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.289332 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb1307c-b75c-4636-a08b-b6be0eec41cd","Type":"ContainerDied","Data":"fe96a7a19086378fa74f9cfac1ecf10898cbe088f52ba45707fee2ebd921069c"} Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.289374 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb1307c-b75c-4636-a08b-b6be0eec41cd","Type":"ContainerDied","Data":"6ba14fb5358d84eb29866e2f70948b88f51986cbbc631fc40e8f34eae44f666e"} Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.289392 4964 scope.go:117] "RemoveContainer" containerID="13cdba0ef215d4cde02983e0b02e0dda19fc6e4a88e302d182a150ba9ce87275" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.289428 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.309542 4964 scope.go:117] "RemoveContainer" containerID="423ab058eab13a173490af5786bb7076721e6c3d05cde6474b2a812c54fd2424" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.311193 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-scripts\") pod \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.311343 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-ceilometer-tls-certs\") pod \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.311543 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb1307c-b75c-4636-a08b-b6be0eec41cd-run-httpd\") pod \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.311730 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb1307c-b75c-4636-a08b-b6be0eec41cd-log-httpd\") pod \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.312488 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-sg-core-conf-yaml\") pod \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " Oct 04 02:58:59 crc 
kubenswrapper[4964]: I1004 02:58:59.312958 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p556c\" (UniqueName: \"kubernetes.io/projected/2bb1307c-b75c-4636-a08b-b6be0eec41cd-kube-api-access-p556c\") pod \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.313239 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-config-data\") pod \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.313374 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-combined-ca-bundle\") pod \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\" (UID: \"2bb1307c-b75c-4636-a08b-b6be0eec41cd\") " Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.311998 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb1307c-b75c-4636-a08b-b6be0eec41cd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2bb1307c-b75c-4636-a08b-b6be0eec41cd" (UID: "2bb1307c-b75c-4636-a08b-b6be0eec41cd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.312396 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb1307c-b75c-4636-a08b-b6be0eec41cd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2bb1307c-b75c-4636-a08b-b6be0eec41cd" (UID: "2bb1307c-b75c-4636-a08b-b6be0eec41cd"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.314297 4964 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb1307c-b75c-4636-a08b-b6be0eec41cd-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.314477 4964 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb1307c-b75c-4636-a08b-b6be0eec41cd-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.318585 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-scripts" (OuterVolumeSpecName: "scripts") pod "2bb1307c-b75c-4636-a08b-b6be0eec41cd" (UID: "2bb1307c-b75c-4636-a08b-b6be0eec41cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.324711 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb1307c-b75c-4636-a08b-b6be0eec41cd-kube-api-access-p556c" (OuterVolumeSpecName: "kube-api-access-p556c") pod "2bb1307c-b75c-4636-a08b-b6be0eec41cd" (UID: "2bb1307c-b75c-4636-a08b-b6be0eec41cd"). InnerVolumeSpecName "kube-api-access-p556c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.346383 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2bb1307c-b75c-4636-a08b-b6be0eec41cd" (UID: "2bb1307c-b75c-4636-a08b-b6be0eec41cd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.380280 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2bb1307c-b75c-4636-a08b-b6be0eec41cd" (UID: "2bb1307c-b75c-4636-a08b-b6be0eec41cd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.420317 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.420363 4964 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.420382 4964 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.420578 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p556c\" (UniqueName: \"kubernetes.io/projected/2bb1307c-b75c-4636-a08b-b6be0eec41cd-kube-api-access-p556c\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.434638 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-config-data" (OuterVolumeSpecName: "config-data") pod "2bb1307c-b75c-4636-a08b-b6be0eec41cd" (UID: "2bb1307c-b75c-4636-a08b-b6be0eec41cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.439448 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bb1307c-b75c-4636-a08b-b6be0eec41cd" (UID: "2bb1307c-b75c-4636-a08b-b6be0eec41cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.466469 4964 scope.go:117] "RemoveContainer" containerID="fe96a7a19086378fa74f9cfac1ecf10898cbe088f52ba45707fee2ebd921069c" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.506112 4964 scope.go:117] "RemoveContainer" containerID="2b53a5c9caf49fdaa7d0c279a279de2852f91b876442a51f8efb1701195e3f23" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.522373 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.522418 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb1307c-b75c-4636-a08b-b6be0eec41cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.536452 4964 scope.go:117] "RemoveContainer" containerID="13cdba0ef215d4cde02983e0b02e0dda19fc6e4a88e302d182a150ba9ce87275" Oct 04 02:58:59 crc kubenswrapper[4964]: E1004 02:58:59.536960 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13cdba0ef215d4cde02983e0b02e0dda19fc6e4a88e302d182a150ba9ce87275\": container with ID starting with 13cdba0ef215d4cde02983e0b02e0dda19fc6e4a88e302d182a150ba9ce87275 not found: ID does not exist" 
containerID="13cdba0ef215d4cde02983e0b02e0dda19fc6e4a88e302d182a150ba9ce87275" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.536988 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13cdba0ef215d4cde02983e0b02e0dda19fc6e4a88e302d182a150ba9ce87275"} err="failed to get container status \"13cdba0ef215d4cde02983e0b02e0dda19fc6e4a88e302d182a150ba9ce87275\": rpc error: code = NotFound desc = could not find container \"13cdba0ef215d4cde02983e0b02e0dda19fc6e4a88e302d182a150ba9ce87275\": container with ID starting with 13cdba0ef215d4cde02983e0b02e0dda19fc6e4a88e302d182a150ba9ce87275 not found: ID does not exist" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.537009 4964 scope.go:117] "RemoveContainer" containerID="423ab058eab13a173490af5786bb7076721e6c3d05cde6474b2a812c54fd2424" Oct 04 02:58:59 crc kubenswrapper[4964]: E1004 02:58:59.537299 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"423ab058eab13a173490af5786bb7076721e6c3d05cde6474b2a812c54fd2424\": container with ID starting with 423ab058eab13a173490af5786bb7076721e6c3d05cde6474b2a812c54fd2424 not found: ID does not exist" containerID="423ab058eab13a173490af5786bb7076721e6c3d05cde6474b2a812c54fd2424" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.537342 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"423ab058eab13a173490af5786bb7076721e6c3d05cde6474b2a812c54fd2424"} err="failed to get container status \"423ab058eab13a173490af5786bb7076721e6c3d05cde6474b2a812c54fd2424\": rpc error: code = NotFound desc = could not find container \"423ab058eab13a173490af5786bb7076721e6c3d05cde6474b2a812c54fd2424\": container with ID starting with 423ab058eab13a173490af5786bb7076721e6c3d05cde6474b2a812c54fd2424 not found: ID does not exist" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.537368 4964 scope.go:117] 
"RemoveContainer" containerID="fe96a7a19086378fa74f9cfac1ecf10898cbe088f52ba45707fee2ebd921069c" Oct 04 02:58:59 crc kubenswrapper[4964]: E1004 02:58:59.537797 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe96a7a19086378fa74f9cfac1ecf10898cbe088f52ba45707fee2ebd921069c\": container with ID starting with fe96a7a19086378fa74f9cfac1ecf10898cbe088f52ba45707fee2ebd921069c not found: ID does not exist" containerID="fe96a7a19086378fa74f9cfac1ecf10898cbe088f52ba45707fee2ebd921069c" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.537818 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe96a7a19086378fa74f9cfac1ecf10898cbe088f52ba45707fee2ebd921069c"} err="failed to get container status \"fe96a7a19086378fa74f9cfac1ecf10898cbe088f52ba45707fee2ebd921069c\": rpc error: code = NotFound desc = could not find container \"fe96a7a19086378fa74f9cfac1ecf10898cbe088f52ba45707fee2ebd921069c\": container with ID starting with fe96a7a19086378fa74f9cfac1ecf10898cbe088f52ba45707fee2ebd921069c not found: ID does not exist" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.537831 4964 scope.go:117] "RemoveContainer" containerID="2b53a5c9caf49fdaa7d0c279a279de2852f91b876442a51f8efb1701195e3f23" Oct 04 02:58:59 crc kubenswrapper[4964]: E1004 02:58:59.538431 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b53a5c9caf49fdaa7d0c279a279de2852f91b876442a51f8efb1701195e3f23\": container with ID starting with 2b53a5c9caf49fdaa7d0c279a279de2852f91b876442a51f8efb1701195e3f23 not found: ID does not exist" containerID="2b53a5c9caf49fdaa7d0c279a279de2852f91b876442a51f8efb1701195e3f23" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.538453 4964 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2b53a5c9caf49fdaa7d0c279a279de2852f91b876442a51f8efb1701195e3f23"} err="failed to get container status \"2b53a5c9caf49fdaa7d0c279a279de2852f91b876442a51f8efb1701195e3f23\": rpc error: code = NotFound desc = could not find container \"2b53a5c9caf49fdaa7d0c279a279de2852f91b876442a51f8efb1701195e3f23\": container with ID starting with 2b53a5c9caf49fdaa7d0c279a279de2852f91b876442a51f8efb1701195e3f23 not found: ID does not exist" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.652534 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.669036 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.680367 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:58:59 crc kubenswrapper[4964]: E1004 02:58:59.680782 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerName="ceilometer-central-agent" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.680802 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerName="ceilometer-central-agent" Oct 04 02:58:59 crc kubenswrapper[4964]: E1004 02:58:59.680816 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerName="proxy-httpd" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.680823 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerName="proxy-httpd" Oct 04 02:58:59 crc kubenswrapper[4964]: E1004 02:58:59.680835 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerName="ceilometer-notification-agent" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.680842 4964 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerName="ceilometer-notification-agent" Oct 04 02:58:59 crc kubenswrapper[4964]: E1004 02:58:59.680874 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerName="sg-core" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.680880 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerName="sg-core" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.681030 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerName="ceilometer-central-agent" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.681053 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerName="ceilometer-notification-agent" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.681066 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerName="sg-core" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.681078 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" containerName="proxy-httpd" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.682650 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.687366 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.687516 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.687623 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.689893 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.827597 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf98s\" (UniqueName: \"kubernetes.io/projected/69b09710-8018-47e1-9d2e-36df63451268-kube-api-access-pf98s\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.827948 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-scripts\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.828004 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-config-data\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.828019 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/69b09710-8018-47e1-9d2e-36df63451268-log-httpd\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.828060 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.828117 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.828134 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69b09710-8018-47e1-9d2e-36df63451268-run-httpd\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.828171 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.929892 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.929992 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.930013 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69b09710-8018-47e1-9d2e-36df63451268-run-httpd\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.930054 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.930083 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf98s\" (UniqueName: \"kubernetes.io/projected/69b09710-8018-47e1-9d2e-36df63451268-kube-api-access-pf98s\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.930099 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-scripts\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.930146 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69b09710-8018-47e1-9d2e-36df63451268-log-httpd\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.930165 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-config-data\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.930863 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69b09710-8018-47e1-9d2e-36df63451268-run-httpd\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.931439 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69b09710-8018-47e1-9d2e-36df63451268-log-httpd\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.937052 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.937124 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-scripts\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.944408 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.944751 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-config-data\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.945489 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:58:59 crc kubenswrapper[4964]: I1004 02:58:59.956273 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf98s\" (UniqueName: \"kubernetes.io/projected/69b09710-8018-47e1-9d2e-36df63451268-kube-api-access-pf98s\") pod \"ceilometer-0\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " pod="openstack/ceilometer-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.025472 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.095453 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.235921 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-logs\") pod \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.236764 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcx8j\" (UniqueName: \"kubernetes.io/projected/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-kube-api-access-hcx8j\") pod \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.236782 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-logs" (OuterVolumeSpecName: "logs") pod "e6c36db3-d1d2-4130-b63f-6c94281b8bc8" (UID: "e6c36db3-d1d2-4130-b63f-6c94281b8bc8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.236819 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-combined-ca-bundle\") pod \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.236852 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-config-data\") pod \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\" (UID: \"e6c36db3-d1d2-4130-b63f-6c94281b8bc8\") " Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.238054 4964 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-logs\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.243017 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-kube-api-access-hcx8j" (OuterVolumeSpecName: "kube-api-access-hcx8j") pod "e6c36db3-d1d2-4130-b63f-6c94281b8bc8" (UID: "e6c36db3-d1d2-4130-b63f-6c94281b8bc8"). InnerVolumeSpecName "kube-api-access-hcx8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.267743 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-config-data" (OuterVolumeSpecName: "config-data") pod "e6c36db3-d1d2-4130-b63f-6c94281b8bc8" (UID: "e6c36db3-d1d2-4130-b63f-6c94281b8bc8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.271233 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6c36db3-d1d2-4130-b63f-6c94281b8bc8" (UID: "e6c36db3-d1d2-4130-b63f-6c94281b8bc8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.298863 4964 generic.go:334] "Generic (PLEG): container finished" podID="e6c36db3-d1d2-4130-b63f-6c94281b8bc8" containerID="e3520055b6b1c2ddfef121f6b5c7d263ea10bfdefd3b5dead486de907ce91361" exitCode=0 Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.298897 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.298952 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e6c36db3-d1d2-4130-b63f-6c94281b8bc8","Type":"ContainerDied","Data":"e3520055b6b1c2ddfef121f6b5c7d263ea10bfdefd3b5dead486de907ce91361"} Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.299079 4964 scope.go:117] "RemoveContainer" containerID="e3520055b6b1c2ddfef121f6b5c7d263ea10bfdefd3b5dead486de907ce91361" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.298986 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e6c36db3-d1d2-4130-b63f-6c94281b8bc8","Type":"ContainerDied","Data":"4cccc788089520ca65f5ddfccb0531141ea5df7be76e03446fd4ee0d1300c375"} Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.338667 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.339926 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcx8j\" 
(UniqueName: \"kubernetes.io/projected/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-kube-api-access-hcx8j\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.339953 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.339966 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6c36db3-d1d2-4130-b63f-6c94281b8bc8-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.346590 4964 scope.go:117] "RemoveContainer" containerID="2bb996504a950d4b69f6f0e7d9c48cf29ad063462a3b1e5970068f87e0012628" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.356105 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.381832 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 04 02:59:00 crc kubenswrapper[4964]: E1004 02:59:00.382469 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c36db3-d1d2-4130-b63f-6c94281b8bc8" containerName="nova-api-log" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.382493 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c36db3-d1d2-4130-b63f-6c94281b8bc8" containerName="nova-api-log" Oct 04 02:59:00 crc kubenswrapper[4964]: E1004 02:59:00.382561 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c36db3-d1d2-4130-b63f-6c94281b8bc8" containerName="nova-api-api" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.382573 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c36db3-d1d2-4130-b63f-6c94281b8bc8" containerName="nova-api-api" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.382924 4964 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c36db3-d1d2-4130-b63f-6c94281b8bc8" containerName="nova-api-log" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.382970 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c36db3-d1d2-4130-b63f-6c94281b8bc8" containerName="nova-api-api" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.384421 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.386938 4964 scope.go:117] "RemoveContainer" containerID="e3520055b6b1c2ddfef121f6b5c7d263ea10bfdefd3b5dead486de907ce91361" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.387380 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.387554 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.387916 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 04 02:59:00 crc kubenswrapper[4964]: E1004 02:59:00.388008 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3520055b6b1c2ddfef121f6b5c7d263ea10bfdefd3b5dead486de907ce91361\": container with ID starting with e3520055b6b1c2ddfef121f6b5c7d263ea10bfdefd3b5dead486de907ce91361 not found: ID does not exist" containerID="e3520055b6b1c2ddfef121f6b5c7d263ea10bfdefd3b5dead486de907ce91361" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.388054 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3520055b6b1c2ddfef121f6b5c7d263ea10bfdefd3b5dead486de907ce91361"} err="failed to get container status \"e3520055b6b1c2ddfef121f6b5c7d263ea10bfdefd3b5dead486de907ce91361\": rpc error: code = 
NotFound desc = could not find container \"e3520055b6b1c2ddfef121f6b5c7d263ea10bfdefd3b5dead486de907ce91361\": container with ID starting with e3520055b6b1c2ddfef121f6b5c7d263ea10bfdefd3b5dead486de907ce91361 not found: ID does not exist" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.388838 4964 scope.go:117] "RemoveContainer" containerID="2bb996504a950d4b69f6f0e7d9c48cf29ad063462a3b1e5970068f87e0012628" Oct 04 02:59:00 crc kubenswrapper[4964]: E1004 02:59:00.394828 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bb996504a950d4b69f6f0e7d9c48cf29ad063462a3b1e5970068f87e0012628\": container with ID starting with 2bb996504a950d4b69f6f0e7d9c48cf29ad063462a3b1e5970068f87e0012628 not found: ID does not exist" containerID="2bb996504a950d4b69f6f0e7d9c48cf29ad063462a3b1e5970068f87e0012628" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.394869 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bb996504a950d4b69f6f0e7d9c48cf29ad063462a3b1e5970068f87e0012628"} err="failed to get container status \"2bb996504a950d4b69f6f0e7d9c48cf29ad063462a3b1e5970068f87e0012628\": rpc error: code = NotFound desc = could not find container \"2bb996504a950d4b69f6f0e7d9c48cf29ad063462a3b1e5970068f87e0012628\": container with ID starting with 2bb996504a950d4b69f6f0e7d9c48cf29ad063462a3b1e5970068f87e0012628 not found: ID does not exist" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.399538 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.542882 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-config-data\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 
crc kubenswrapper[4964]: I1004 02:59:00.542989 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.543047 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhprk\" (UniqueName: \"kubernetes.io/projected/88896418-ceba-4933-90f2-cca7dcb2d0be-kube-api-access-lhprk\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.543067 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-public-tls-certs\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.543083 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-internal-tls-certs\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.543124 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88896418-ceba-4933-90f2-cca7dcb2d0be-logs\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.567173 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] 
Oct 04 02:59:00 crc kubenswrapper[4964]: W1004 02:59:00.573212 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69b09710_8018_47e1_9d2e_36df63451268.slice/crio-b971ffdef91e04f926fbbb7504b40e58d0a2c82ebfac494573bcf6ede57022b0 WatchSource:0}: Error finding container b971ffdef91e04f926fbbb7504b40e58d0a2c82ebfac494573bcf6ede57022b0: Status 404 returned error can't find the container with id b971ffdef91e04f926fbbb7504b40e58d0a2c82ebfac494573bcf6ede57022b0 Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.645107 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.645263 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhprk\" (UniqueName: \"kubernetes.io/projected/88896418-ceba-4933-90f2-cca7dcb2d0be-kube-api-access-lhprk\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.645305 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-public-tls-certs\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.645335 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-internal-tls-certs\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc 
kubenswrapper[4964]: I1004 02:59:00.645399 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88896418-ceba-4933-90f2-cca7dcb2d0be-logs\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.645471 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-config-data\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.646959 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88896418-ceba-4933-90f2-cca7dcb2d0be-logs\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.650836 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-internal-tls-certs\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.651807 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-public-tls-certs\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.652131 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") 
" pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.652256 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-config-data\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.675686 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhprk\" (UniqueName: \"kubernetes.io/projected/88896418-ceba-4933-90f2-cca7dcb2d0be-kube-api-access-lhprk\") pod \"nova-api-0\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.712999 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.865277 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb1307c-b75c-4636-a08b-b6be0eec41cd" path="/var/lib/kubelet/pods/2bb1307c-b75c-4636-a08b-b6be0eec41cd/volumes" Oct 04 02:59:00 crc kubenswrapper[4964]: I1004 02:59:00.866799 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6c36db3-d1d2-4130-b63f-6c94281b8bc8" path="/var/lib/kubelet/pods/e6c36db3-d1d2-4130-b63f-6c94281b8bc8/volumes" Oct 04 02:59:01 crc kubenswrapper[4964]: I1004 02:59:01.203670 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:59:01 crc kubenswrapper[4964]: I1004 02:59:01.315433 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69b09710-8018-47e1-9d2e-36df63451268","Type":"ContainerStarted","Data":"b971ffdef91e04f926fbbb7504b40e58d0a2c82ebfac494573bcf6ede57022b0"} Oct 04 02:59:01 crc kubenswrapper[4964]: I1004 02:59:01.316454 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"88896418-ceba-4933-90f2-cca7dcb2d0be","Type":"ContainerStarted","Data":"6db6e9eebb2d26deb5a85daaf1c5ab7fc8b9a7b4c78b7339425d62ab2a5ab9c3"} Oct 04 02:59:02 crc kubenswrapper[4964]: I1004 02:59:02.324824 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69b09710-8018-47e1-9d2e-36df63451268","Type":"ContainerStarted","Data":"ebd282f42eb883e68535e463ecfb15bc70d23b212b1d750db8bb89675e00b115"} Oct 04 02:59:02 crc kubenswrapper[4964]: I1004 02:59:02.325075 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69b09710-8018-47e1-9d2e-36df63451268","Type":"ContainerStarted","Data":"b7423db947cba0f404cde65df2af840f7f0bee247ed7824d2aaa9a946367c368"} Oct 04 02:59:02 crc kubenswrapper[4964]: I1004 02:59:02.327137 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88896418-ceba-4933-90f2-cca7dcb2d0be","Type":"ContainerStarted","Data":"4ad62df157fd04036cc0607b667fa893670a4da367d269d95018618409b6a69b"} Oct 04 02:59:02 crc kubenswrapper[4964]: I1004 02:59:02.327170 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88896418-ceba-4933-90f2-cca7dcb2d0be","Type":"ContainerStarted","Data":"117d4ee4da22330cf6cb5ae59194f85f4ad921c52d4f4b67b2d2da8d25afb784"} Oct 04 02:59:02 crc kubenswrapper[4964]: I1004 02:59:02.364229 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.364204232 podStartE2EDuration="2.364204232s" podCreationTimestamp="2025-10-04 02:59:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:59:02.357400639 +0000 UTC m=+1122.254359347" watchObservedRunningTime="2025-10-04 02:59:02.364204232 +0000 UTC m=+1122.261162880" Oct 04 02:59:02 crc kubenswrapper[4964]: I1004 02:59:02.669108 4964 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:59:03 crc kubenswrapper[4964]: I1004 02:59:03.345400 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69b09710-8018-47e1-9d2e-36df63451268","Type":"ContainerStarted","Data":"beda92b6737d95754b37686a36cd2441d380f20fa24da514e8742cc4389d9723"} Oct 04 02:59:03 crc kubenswrapper[4964]: I1004 02:59:03.782908 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 02:59:03 crc kubenswrapper[4964]: I1004 02:59:03.850030 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-rvg6j"] Oct 04 02:59:03 crc kubenswrapper[4964]: I1004 02:59:03.850726 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" podUID="0100d1f7-03f9-4c09-a6df-4b30250ccfa4" containerName="dnsmasq-dns" containerID="cri-o://78dda55a03c1069a7ccb6083e8b394cec895e7638b9524bf54b916c003ac146e" gracePeriod=10 Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.224175 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.346722 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-config\") pod \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.346799 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55vkg\" (UniqueName: \"kubernetes.io/projected/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-kube-api-access-55vkg\") pod \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.346852 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-ovsdbserver-nb\") pod \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.346887 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-dns-svc\") pod \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.346970 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-ovsdbserver-sb\") pod \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\" (UID: \"0100d1f7-03f9-4c09-a6df-4b30250ccfa4\") " Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.351776 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-kube-api-access-55vkg" (OuterVolumeSpecName: "kube-api-access-55vkg") pod "0100d1f7-03f9-4c09-a6df-4b30250ccfa4" (UID: "0100d1f7-03f9-4c09-a6df-4b30250ccfa4"). InnerVolumeSpecName "kube-api-access-55vkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.354051 4964 generic.go:334] "Generic (PLEG): container finished" podID="0100d1f7-03f9-4c09-a6df-4b30250ccfa4" containerID="78dda55a03c1069a7ccb6083e8b394cec895e7638b9524bf54b916c003ac146e" exitCode=0 Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.354099 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" event={"ID":"0100d1f7-03f9-4c09-a6df-4b30250ccfa4","Type":"ContainerDied","Data":"78dda55a03c1069a7ccb6083e8b394cec895e7638b9524bf54b916c003ac146e"} Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.354123 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" event={"ID":"0100d1f7-03f9-4c09-a6df-4b30250ccfa4","Type":"ContainerDied","Data":"a0d9730f923f98c0e363cf21a451eefbb038a74f34cfa86688589bf28fef8ea8"} Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.354138 4964 scope.go:117] "RemoveContainer" containerID="78dda55a03c1069a7ccb6083e8b394cec895e7638b9524bf54b916c003ac146e" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.354220 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-rvg6j" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.359067 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69b09710-8018-47e1-9d2e-36df63451268","Type":"ContainerStarted","Data":"577d8dac90a628c448abf855afcc52a97c16028ad831736c1f330c9ba30b49df"} Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.361351 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.377900 4964 scope.go:117] "RemoveContainer" containerID="3a47a534223a9e03e808c28d961d55dbd817912087ed88a5d7c930060c037a0c" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.393153 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.128332317 podStartE2EDuration="5.393130297s" podCreationTimestamp="2025-10-04 02:58:59 +0000 UTC" firstStartedPulling="2025-10-04 02:59:00.575949651 +0000 UTC m=+1120.472908289" lastFinishedPulling="2025-10-04 02:59:03.840747591 +0000 UTC m=+1123.737706269" observedRunningTime="2025-10-04 02:59:04.382195373 +0000 UTC m=+1124.279154021" watchObservedRunningTime="2025-10-04 02:59:04.393130297 +0000 UTC m=+1124.290088935" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.405088 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0100d1f7-03f9-4c09-a6df-4b30250ccfa4" (UID: "0100d1f7-03f9-4c09-a6df-4b30250ccfa4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.405118 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-config" (OuterVolumeSpecName: "config") pod "0100d1f7-03f9-4c09-a6df-4b30250ccfa4" (UID: "0100d1f7-03f9-4c09-a6df-4b30250ccfa4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.405545 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0100d1f7-03f9-4c09-a6df-4b30250ccfa4" (UID: "0100d1f7-03f9-4c09-a6df-4b30250ccfa4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.406271 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0100d1f7-03f9-4c09-a6df-4b30250ccfa4" (UID: "0100d1f7-03f9-4c09-a6df-4b30250ccfa4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.409851 4964 scope.go:117] "RemoveContainer" containerID="78dda55a03c1069a7ccb6083e8b394cec895e7638b9524bf54b916c003ac146e" Oct 04 02:59:04 crc kubenswrapper[4964]: E1004 02:59:04.410271 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78dda55a03c1069a7ccb6083e8b394cec895e7638b9524bf54b916c003ac146e\": container with ID starting with 78dda55a03c1069a7ccb6083e8b394cec895e7638b9524bf54b916c003ac146e not found: ID does not exist" containerID="78dda55a03c1069a7ccb6083e8b394cec895e7638b9524bf54b916c003ac146e" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.410318 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78dda55a03c1069a7ccb6083e8b394cec895e7638b9524bf54b916c003ac146e"} err="failed to get container status \"78dda55a03c1069a7ccb6083e8b394cec895e7638b9524bf54b916c003ac146e\": rpc error: code = NotFound desc = could not find container \"78dda55a03c1069a7ccb6083e8b394cec895e7638b9524bf54b916c003ac146e\": container with ID starting with 78dda55a03c1069a7ccb6083e8b394cec895e7638b9524bf54b916c003ac146e not found: ID does not exist" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.410347 4964 scope.go:117] "RemoveContainer" containerID="3a47a534223a9e03e808c28d961d55dbd817912087ed88a5d7c930060c037a0c" Oct 04 02:59:04 crc kubenswrapper[4964]: E1004 02:59:04.410738 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a47a534223a9e03e808c28d961d55dbd817912087ed88a5d7c930060c037a0c\": container with ID starting with 3a47a534223a9e03e808c28d961d55dbd817912087ed88a5d7c930060c037a0c not found: ID does not exist" containerID="3a47a534223a9e03e808c28d961d55dbd817912087ed88a5d7c930060c037a0c" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.410763 
4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a47a534223a9e03e808c28d961d55dbd817912087ed88a5d7c930060c037a0c"} err="failed to get container status \"3a47a534223a9e03e808c28d961d55dbd817912087ed88a5d7c930060c037a0c\": rpc error: code = NotFound desc = could not find container \"3a47a534223a9e03e808c28d961d55dbd817912087ed88a5d7c930060c037a0c\": container with ID starting with 3a47a534223a9e03e808c28d961d55dbd817912087ed88a5d7c930060c037a0c not found: ID does not exist" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.448653 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-config\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.448877 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55vkg\" (UniqueName: \"kubernetes.io/projected/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-kube-api-access-55vkg\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.448962 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.449042 4964 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.449098 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0100d1f7-03f9-4c09-a6df-4b30250ccfa4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.448962 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.449234 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.692677 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-rvg6j"] Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.704471 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-rvg6j"] Oct 04 02:59:04 crc kubenswrapper[4964]: I1004 02:59:04.883596 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0100d1f7-03f9-4c09-a6df-4b30250ccfa4" path="/var/lib/kubelet/pods/0100d1f7-03f9-4c09-a6df-4b30250ccfa4/volumes" Oct 04 02:59:07 crc kubenswrapper[4964]: I1004 02:59:07.669551 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:59:07 crc kubenswrapper[4964]: I1004 02:59:07.702305 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.464496 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.666450 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-ns9dn"] Oct 04 02:59:08 crc kubenswrapper[4964]: E1004 02:59:08.666858 4964 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="0100d1f7-03f9-4c09-a6df-4b30250ccfa4" containerName="dnsmasq-dns" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.666879 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="0100d1f7-03f9-4c09-a6df-4b30250ccfa4" containerName="dnsmasq-dns" Oct 04 02:59:08 crc kubenswrapper[4964]: E1004 02:59:08.666917 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0100d1f7-03f9-4c09-a6df-4b30250ccfa4" containerName="init" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.666925 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="0100d1f7-03f9-4c09-a6df-4b30250ccfa4" containerName="init" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.667125 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="0100d1f7-03f9-4c09-a6df-4b30250ccfa4" containerName="dnsmasq-dns" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.667821 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.671686 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.671835 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.682098 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ns9dn"] Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.856893 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-scripts\") pod \"nova-cell1-cell-mapping-ns9dn\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 
02:59:08.856936 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-config-data\") pod \"nova-cell1-cell-mapping-ns9dn\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.856980 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ns9dn\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.857008 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjnmj\" (UniqueName: \"kubernetes.io/projected/edf8a38b-136c-4af9-b766-136bdfabd69d-kube-api-access-pjnmj\") pod \"nova-cell1-cell-mapping-ns9dn\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.959304 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-scripts\") pod \"nova-cell1-cell-mapping-ns9dn\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.959367 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-config-data\") pod \"nova-cell1-cell-mapping-ns9dn\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 
02:59:08.959459 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ns9dn\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.959518 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjnmj\" (UniqueName: \"kubernetes.io/projected/edf8a38b-136c-4af9-b766-136bdfabd69d-kube-api-access-pjnmj\") pod \"nova-cell1-cell-mapping-ns9dn\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.970717 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-scripts\") pod \"nova-cell1-cell-mapping-ns9dn\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.971023 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ns9dn\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.971121 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-config-data\") pod \"nova-cell1-cell-mapping-ns9dn\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.989360 4964 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-pjnmj\" (UniqueName: \"kubernetes.io/projected/edf8a38b-136c-4af9-b766-136bdfabd69d-kube-api-access-pjnmj\") pod \"nova-cell1-cell-mapping-ns9dn\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:08 crc kubenswrapper[4964]: I1004 02:59:08.997884 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:09 crc kubenswrapper[4964]: W1004 02:59:09.546663 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedf8a38b_136c_4af9_b766_136bdfabd69d.slice/crio-3ad7a56ec5f43d0876af595333715a9555cf4b4b4429d29a7e7ee97e39785fa7 WatchSource:0}: Error finding container 3ad7a56ec5f43d0876af595333715a9555cf4b4b4429d29a7e7ee97e39785fa7: Status 404 returned error can't find the container with id 3ad7a56ec5f43d0876af595333715a9555cf4b4b4429d29a7e7ee97e39785fa7 Oct 04 02:59:09 crc kubenswrapper[4964]: I1004 02:59:09.550411 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ns9dn"] Oct 04 02:59:10 crc kubenswrapper[4964]: I1004 02:59:10.453712 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ns9dn" event={"ID":"edf8a38b-136c-4af9-b766-136bdfabd69d","Type":"ContainerStarted","Data":"8f91cd4dd407ac9f844c11529b2191c1c27a813b20c8ef2c6a9b0037ef82dde7"} Oct 04 02:59:10 crc kubenswrapper[4964]: I1004 02:59:10.454143 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ns9dn" event={"ID":"edf8a38b-136c-4af9-b766-136bdfabd69d","Type":"ContainerStarted","Data":"3ad7a56ec5f43d0876af595333715a9555cf4b4b4429d29a7e7ee97e39785fa7"} Oct 04 02:59:10 crc kubenswrapper[4964]: I1004 02:59:10.484855 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-ns9dn" 
podStartSLOduration=2.484822693 podStartE2EDuration="2.484822693s" podCreationTimestamp="2025-10-04 02:59:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:59:10.482202673 +0000 UTC m=+1130.379161331" watchObservedRunningTime="2025-10-04 02:59:10.484822693 +0000 UTC m=+1130.381781371" Oct 04 02:59:10 crc kubenswrapper[4964]: I1004 02:59:10.714083 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 04 02:59:10 crc kubenswrapper[4964]: I1004 02:59:10.714157 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 04 02:59:11 crc kubenswrapper[4964]: I1004 02:59:11.736779 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="88896418-ceba-4933-90f2-cca7dcb2d0be" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.186:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 02:59:11 crc kubenswrapper[4964]: I1004 02:59:11.736780 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="88896418-ceba-4933-90f2-cca7dcb2d0be" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.186:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 02:59:14 crc kubenswrapper[4964]: I1004 02:59:14.503996 4964 generic.go:334] "Generic (PLEG): container finished" podID="edf8a38b-136c-4af9-b766-136bdfabd69d" containerID="8f91cd4dd407ac9f844c11529b2191c1c27a813b20c8ef2c6a9b0037ef82dde7" exitCode=0 Oct 04 02:59:14 crc kubenswrapper[4964]: I1004 02:59:14.504143 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ns9dn" 
event={"ID":"edf8a38b-136c-4af9-b766-136bdfabd69d","Type":"ContainerDied","Data":"8f91cd4dd407ac9f844c11529b2191c1c27a813b20c8ef2c6a9b0037ef82dde7"} Oct 04 02:59:15 crc kubenswrapper[4964]: I1004 02:59:15.986772 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.117054 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjnmj\" (UniqueName: \"kubernetes.io/projected/edf8a38b-136c-4af9-b766-136bdfabd69d-kube-api-access-pjnmj\") pod \"edf8a38b-136c-4af9-b766-136bdfabd69d\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.117128 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-config-data\") pod \"edf8a38b-136c-4af9-b766-136bdfabd69d\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.117182 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-combined-ca-bundle\") pod \"edf8a38b-136c-4af9-b766-136bdfabd69d\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.117250 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-scripts\") pod \"edf8a38b-136c-4af9-b766-136bdfabd69d\" (UID: \"edf8a38b-136c-4af9-b766-136bdfabd69d\") " Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.123057 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edf8a38b-136c-4af9-b766-136bdfabd69d-kube-api-access-pjnmj" (OuterVolumeSpecName: 
"kube-api-access-pjnmj") pod "edf8a38b-136c-4af9-b766-136bdfabd69d" (UID: "edf8a38b-136c-4af9-b766-136bdfabd69d"). InnerVolumeSpecName "kube-api-access-pjnmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.124358 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-scripts" (OuterVolumeSpecName: "scripts") pod "edf8a38b-136c-4af9-b766-136bdfabd69d" (UID: "edf8a38b-136c-4af9-b766-136bdfabd69d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.154720 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-config-data" (OuterVolumeSpecName: "config-data") pod "edf8a38b-136c-4af9-b766-136bdfabd69d" (UID: "edf8a38b-136c-4af9-b766-136bdfabd69d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.155129 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edf8a38b-136c-4af9-b766-136bdfabd69d" (UID: "edf8a38b-136c-4af9-b766-136bdfabd69d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.219829 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjnmj\" (UniqueName: \"kubernetes.io/projected/edf8a38b-136c-4af9-b766-136bdfabd69d-kube-api-access-pjnmj\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.219881 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.219901 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.219925 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edf8a38b-136c-4af9-b766-136bdfabd69d-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.534506 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ns9dn" event={"ID":"edf8a38b-136c-4af9-b766-136bdfabd69d","Type":"ContainerDied","Data":"3ad7a56ec5f43d0876af595333715a9555cf4b4b4429d29a7e7ee97e39785fa7"} Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.535792 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ad7a56ec5f43d0876af595333715a9555cf4b4b4429d29a7e7ee97e39785fa7" Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.534599 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ns9dn" Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.742552 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.742848 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dec06312-bef7-494b-95d0-0f15ee8860f8" containerName="nova-scheduler-scheduler" containerID="cri-o://394a806a25f046df7a516c452037f8349f85126756ea0422e416f177dcf4e917" gracePeriod=30 Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.755241 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.755747 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="88896418-ceba-4933-90f2-cca7dcb2d0be" containerName="nova-api-api" containerID="cri-o://4ad62df157fd04036cc0607b667fa893670a4da367d269d95018618409b6a69b" gracePeriod=30 Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.755554 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="88896418-ceba-4933-90f2-cca7dcb2d0be" containerName="nova-api-log" containerID="cri-o://117d4ee4da22330cf6cb5ae59194f85f4ad921c52d4f4b67b2d2da8d25afb784" gracePeriod=30 Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.774756 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.775261 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" containerName="nova-metadata-metadata" containerID="cri-o://d2fca9dc482f392dbf2fed1cca5375d8faf65e3251036650aa6198ccbc252b3b" gracePeriod=30 Oct 04 02:59:16 crc kubenswrapper[4964]: I1004 02:59:16.775092 4964 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" containerName="nova-metadata-log" containerID="cri-o://8a7afcf85651f3174fcbe431a4f31d4dcc2f0a64c80a6ac95fd5e74c9b0eaab1" gracePeriod=30 Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 02:59:17.558336 4964 generic.go:334] "Generic (PLEG): container finished" podID="88896418-ceba-4933-90f2-cca7dcb2d0be" containerID="117d4ee4da22330cf6cb5ae59194f85f4ad921c52d4f4b67b2d2da8d25afb784" exitCode=143 Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 02:59:17.558901 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88896418-ceba-4933-90f2-cca7dcb2d0be","Type":"ContainerDied","Data":"117d4ee4da22330cf6cb5ae59194f85f4ad921c52d4f4b67b2d2da8d25afb784"} Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 02:59:17.564450 4964 generic.go:334] "Generic (PLEG): container finished" podID="dec06312-bef7-494b-95d0-0f15ee8860f8" containerID="394a806a25f046df7a516c452037f8349f85126756ea0422e416f177dcf4e917" exitCode=0 Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 02:59:17.564554 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dec06312-bef7-494b-95d0-0f15ee8860f8","Type":"ContainerDied","Data":"394a806a25f046df7a516c452037f8349f85126756ea0422e416f177dcf4e917"} Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 02:59:17.569082 4964 generic.go:334] "Generic (PLEG): container finished" podID="e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" containerID="8a7afcf85651f3174fcbe431a4f31d4dcc2f0a64c80a6ac95fd5e74c9b0eaab1" exitCode=143 Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 02:59:17.569131 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa","Type":"ContainerDied","Data":"8a7afcf85651f3174fcbe431a4f31d4dcc2f0a64c80a6ac95fd5e74c9b0eaab1"} Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 
02:59:17.639609 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 02:59:17.643043 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec06312-bef7-494b-95d0-0f15ee8860f8-config-data\") pod \"dec06312-bef7-494b-95d0-0f15ee8860f8\" (UID: \"dec06312-bef7-494b-95d0-0f15ee8860f8\") " Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 02:59:17.643245 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec06312-bef7-494b-95d0-0f15ee8860f8-combined-ca-bundle\") pod \"dec06312-bef7-494b-95d0-0f15ee8860f8\" (UID: \"dec06312-bef7-494b-95d0-0f15ee8860f8\") " Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 02:59:17.643313 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlvxv\" (UniqueName: \"kubernetes.io/projected/dec06312-bef7-494b-95d0-0f15ee8860f8-kube-api-access-dlvxv\") pod \"dec06312-bef7-494b-95d0-0f15ee8860f8\" (UID: \"dec06312-bef7-494b-95d0-0f15ee8860f8\") " Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 02:59:17.648651 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec06312-bef7-494b-95d0-0f15ee8860f8-kube-api-access-dlvxv" (OuterVolumeSpecName: "kube-api-access-dlvxv") pod "dec06312-bef7-494b-95d0-0f15ee8860f8" (UID: "dec06312-bef7-494b-95d0-0f15ee8860f8"). InnerVolumeSpecName "kube-api-access-dlvxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 02:59:17.690702 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec06312-bef7-494b-95d0-0f15ee8860f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dec06312-bef7-494b-95d0-0f15ee8860f8" (UID: "dec06312-bef7-494b-95d0-0f15ee8860f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 02:59:17.707371 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec06312-bef7-494b-95d0-0f15ee8860f8-config-data" (OuterVolumeSpecName: "config-data") pod "dec06312-bef7-494b-95d0-0f15ee8860f8" (UID: "dec06312-bef7-494b-95d0-0f15ee8860f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 02:59:17.745128 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec06312-bef7-494b-95d0-0f15ee8860f8-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 02:59:17.745157 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec06312-bef7-494b-95d0-0f15ee8860f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:17 crc kubenswrapper[4964]: I1004 02:59:17.745197 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlvxv\" (UniqueName: \"kubernetes.io/projected/dec06312-bef7-494b-95d0-0f15ee8860f8-kube-api-access-dlvxv\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.582126 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"dec06312-bef7-494b-95d0-0f15ee8860f8","Type":"ContainerDied","Data":"422580751fdc584f293c17dfe33473be8c0b9442b685360345499caeb0920343"} Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.582260 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.582594 4964 scope.go:117] "RemoveContainer" containerID="394a806a25f046df7a516c452037f8349f85126756ea0422e416f177dcf4e917" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.636338 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.645064 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.663880 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 02:59:18 crc kubenswrapper[4964]: E1004 02:59:18.664482 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec06312-bef7-494b-95d0-0f15ee8860f8" containerName="nova-scheduler-scheduler" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.664529 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec06312-bef7-494b-95d0-0f15ee8860f8" containerName="nova-scheduler-scheduler" Oct 04 02:59:18 crc kubenswrapper[4964]: E1004 02:59:18.664556 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edf8a38b-136c-4af9-b766-136bdfabd69d" containerName="nova-manage" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.664565 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="edf8a38b-136c-4af9-b766-136bdfabd69d" containerName="nova-manage" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.664989 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec06312-bef7-494b-95d0-0f15ee8860f8" containerName="nova-scheduler-scheduler" Oct 04 02:59:18 crc 
kubenswrapper[4964]: I1004 02:59:18.665039 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="edf8a38b-136c-4af9-b766-136bdfabd69d" containerName="nova-manage" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.666080 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.669360 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.671588 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.704410 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd4751bc-b2dd-4323-83d8-45f639f1a72a-config-data\") pod \"nova-scheduler-0\" (UID: \"cd4751bc-b2dd-4323-83d8-45f639f1a72a\") " pod="openstack/nova-scheduler-0" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.704479 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpx9t\" (UniqueName: \"kubernetes.io/projected/cd4751bc-b2dd-4323-83d8-45f639f1a72a-kube-api-access-wpx9t\") pod \"nova-scheduler-0\" (UID: \"cd4751bc-b2dd-4323-83d8-45f639f1a72a\") " pod="openstack/nova-scheduler-0" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.704599 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4751bc-b2dd-4323-83d8-45f639f1a72a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cd4751bc-b2dd-4323-83d8-45f639f1a72a\") " pod="openstack/nova-scheduler-0" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.806779 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/cd4751bc-b2dd-4323-83d8-45f639f1a72a-config-data\") pod \"nova-scheduler-0\" (UID: \"cd4751bc-b2dd-4323-83d8-45f639f1a72a\") " pod="openstack/nova-scheduler-0" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.806841 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpx9t\" (UniqueName: \"kubernetes.io/projected/cd4751bc-b2dd-4323-83d8-45f639f1a72a-kube-api-access-wpx9t\") pod \"nova-scheduler-0\" (UID: \"cd4751bc-b2dd-4323-83d8-45f639f1a72a\") " pod="openstack/nova-scheduler-0" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.806877 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4751bc-b2dd-4323-83d8-45f639f1a72a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cd4751bc-b2dd-4323-83d8-45f639f1a72a\") " pod="openstack/nova-scheduler-0" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.814519 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd4751bc-b2dd-4323-83d8-45f639f1a72a-config-data\") pod \"nova-scheduler-0\" (UID: \"cd4751bc-b2dd-4323-83d8-45f639f1a72a\") " pod="openstack/nova-scheduler-0" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.814949 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4751bc-b2dd-4323-83d8-45f639f1a72a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cd4751bc-b2dd-4323-83d8-45f639f1a72a\") " pod="openstack/nova-scheduler-0" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.840238 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpx9t\" (UniqueName: \"kubernetes.io/projected/cd4751bc-b2dd-4323-83d8-45f639f1a72a-kube-api-access-wpx9t\") pod \"nova-scheduler-0\" (UID: \"cd4751bc-b2dd-4323-83d8-45f639f1a72a\") " 
pod="openstack/nova-scheduler-0" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.863417 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec06312-bef7-494b-95d0-0f15ee8860f8" path="/var/lib/kubelet/pods/dec06312-bef7-494b-95d0-0f15ee8860f8/volumes" Oct 04 02:59:18 crc kubenswrapper[4964]: I1004 02:59:18.995891 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 02:59:19 crc kubenswrapper[4964]: I1004 02:59:19.577579 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 02:59:19 crc kubenswrapper[4964]: I1004 02:59:19.965977 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.181:8775/\": read tcp 10.217.0.2:52886->10.217.0.181:8775: read: connection reset by peer" Oct 04 02:59:19 crc kubenswrapper[4964]: I1004 02:59:19.966019 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.181:8775/\": read tcp 10.217.0.2:52896->10.217.0.181:8775: read: connection reset by peer" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.449981 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.454332 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.551228 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-config-data\") pod \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.551301 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88896418-ceba-4933-90f2-cca7dcb2d0be-logs\") pod \"88896418-ceba-4933-90f2-cca7dcb2d0be\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.551356 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-internal-tls-certs\") pod \"88896418-ceba-4933-90f2-cca7dcb2d0be\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.551391 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-combined-ca-bundle\") pod \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.551427 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjkqn\" (UniqueName: \"kubernetes.io/projected/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-kube-api-access-bjkqn\") pod \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.551473 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-combined-ca-bundle\") pod \"88896418-ceba-4933-90f2-cca7dcb2d0be\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.551516 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhprk\" (UniqueName: \"kubernetes.io/projected/88896418-ceba-4933-90f2-cca7dcb2d0be-kube-api-access-lhprk\") pod \"88896418-ceba-4933-90f2-cca7dcb2d0be\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.551549 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-config-data\") pod \"88896418-ceba-4933-90f2-cca7dcb2d0be\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.551633 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-public-tls-certs\") pod \"88896418-ceba-4933-90f2-cca7dcb2d0be\" (UID: \"88896418-ceba-4933-90f2-cca7dcb2d0be\") " Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.551688 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-nova-metadata-tls-certs\") pod \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.551903 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-logs\") pod \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\" (UID: \"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa\") " Oct 04 02:59:20 crc 
kubenswrapper[4964]: I1004 02:59:20.554660 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88896418-ceba-4933-90f2-cca7dcb2d0be-logs" (OuterVolumeSpecName: "logs") pod "88896418-ceba-4933-90f2-cca7dcb2d0be" (UID: "88896418-ceba-4933-90f2-cca7dcb2d0be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.554961 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-logs" (OuterVolumeSpecName: "logs") pod "e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" (UID: "e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.581773 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88896418-ceba-4933-90f2-cca7dcb2d0be-kube-api-access-lhprk" (OuterVolumeSpecName: "kube-api-access-lhprk") pod "88896418-ceba-4933-90f2-cca7dcb2d0be" (UID: "88896418-ceba-4933-90f2-cca7dcb2d0be"). InnerVolumeSpecName "kube-api-access-lhprk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.581857 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-kube-api-access-bjkqn" (OuterVolumeSpecName: "kube-api-access-bjkqn") pod "e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" (UID: "e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa"). InnerVolumeSpecName "kube-api-access-bjkqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.606361 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-config-data" (OuterVolumeSpecName: "config-data") pod "e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" (UID: "e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.608303 4964 generic.go:334] "Generic (PLEG): container finished" podID="88896418-ceba-4933-90f2-cca7dcb2d0be" containerID="4ad62df157fd04036cc0607b667fa893670a4da367d269d95018618409b6a69b" exitCode=0 Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.608363 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88896418-ceba-4933-90f2-cca7dcb2d0be","Type":"ContainerDied","Data":"4ad62df157fd04036cc0607b667fa893670a4da367d269d95018618409b6a69b"} Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.608390 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88896418-ceba-4933-90f2-cca7dcb2d0be","Type":"ContainerDied","Data":"6db6e9eebb2d26deb5a85daaf1c5ab7fc8b9a7b4c78b7339425d62ab2a5ab9c3"} Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.608405 4964 scope.go:117] "RemoveContainer" containerID="4ad62df157fd04036cc0607b667fa893670a4da367d269d95018618409b6a69b" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.608528 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.610847 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "88896418-ceba-4933-90f2-cca7dcb2d0be" (UID: "88896418-ceba-4933-90f2-cca7dcb2d0be"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.611743 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88896418-ceba-4933-90f2-cca7dcb2d0be" (UID: "88896418-ceba-4933-90f2-cca7dcb2d0be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.618665 4964 generic.go:334] "Generic (PLEG): container finished" podID="e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" containerID="d2fca9dc482f392dbf2fed1cca5375d8faf65e3251036650aa6198ccbc252b3b" exitCode=0 Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.618735 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa","Type":"ContainerDied","Data":"d2fca9dc482f392dbf2fed1cca5375d8faf65e3251036650aa6198ccbc252b3b"} Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.618761 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa","Type":"ContainerDied","Data":"cf2a48cf5e1b7935a7fb55c2384162c67fb730da5a8d3344021d1cc5222f523d"} Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.618874 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.620782 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cd4751bc-b2dd-4323-83d8-45f639f1a72a","Type":"ContainerStarted","Data":"fe40422cbe807d66c72b55b2318281c0f75e43946828ff96cde23de8c799d17b"} Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.620817 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cd4751bc-b2dd-4323-83d8-45f639f1a72a","Type":"ContainerStarted","Data":"ce443261158dfe5f9b00e0db8227202fa396254c5b256ec41f373eb091425985"} Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.622354 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-config-data" (OuterVolumeSpecName: "config-data") pod "88896418-ceba-4933-90f2-cca7dcb2d0be" (UID: "88896418-ceba-4933-90f2-cca7dcb2d0be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.638142 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "88896418-ceba-4933-90f2-cca7dcb2d0be" (UID: "88896418-ceba-4933-90f2-cca7dcb2d0be"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.640705 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.640674527 podStartE2EDuration="2.640674527s" podCreationTimestamp="2025-10-04 02:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:59:20.637664577 +0000 UTC m=+1140.534623225" watchObservedRunningTime="2025-10-04 02:59:20.640674527 +0000 UTC m=+1140.537633165" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.646991 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" (UID: "e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.648945 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" (UID: "e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.649342 4964 scope.go:117] "RemoveContainer" containerID="117d4ee4da22330cf6cb5ae59194f85f4ad921c52d4f4b67b2d2da8d25afb784" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.655115 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.655138 4964 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88896418-ceba-4933-90f2-cca7dcb2d0be-logs\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.655146 4964 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.655157 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.655166 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjkqn\" (UniqueName: \"kubernetes.io/projected/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-kube-api-access-bjkqn\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.655173 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.655182 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhprk\" 
(UniqueName: \"kubernetes.io/projected/88896418-ceba-4933-90f2-cca7dcb2d0be-kube-api-access-lhprk\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.655189 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.655198 4964 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88896418-ceba-4933-90f2-cca7dcb2d0be-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.655206 4964 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.655214 4964 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa-logs\") on node \"crc\" DevicePath \"\"" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.674826 4964 scope.go:117] "RemoveContainer" containerID="4ad62df157fd04036cc0607b667fa893670a4da367d269d95018618409b6a69b" Oct 04 02:59:20 crc kubenswrapper[4964]: E1004 02:59:20.675342 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ad62df157fd04036cc0607b667fa893670a4da367d269d95018618409b6a69b\": container with ID starting with 4ad62df157fd04036cc0607b667fa893670a4da367d269d95018618409b6a69b not found: ID does not exist" containerID="4ad62df157fd04036cc0607b667fa893670a4da367d269d95018618409b6a69b" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.675388 4964 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4ad62df157fd04036cc0607b667fa893670a4da367d269d95018618409b6a69b"} err="failed to get container status \"4ad62df157fd04036cc0607b667fa893670a4da367d269d95018618409b6a69b\": rpc error: code = NotFound desc = could not find container \"4ad62df157fd04036cc0607b667fa893670a4da367d269d95018618409b6a69b\": container with ID starting with 4ad62df157fd04036cc0607b667fa893670a4da367d269d95018618409b6a69b not found: ID does not exist" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.675414 4964 scope.go:117] "RemoveContainer" containerID="117d4ee4da22330cf6cb5ae59194f85f4ad921c52d4f4b67b2d2da8d25afb784" Oct 04 02:59:20 crc kubenswrapper[4964]: E1004 02:59:20.675794 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"117d4ee4da22330cf6cb5ae59194f85f4ad921c52d4f4b67b2d2da8d25afb784\": container with ID starting with 117d4ee4da22330cf6cb5ae59194f85f4ad921c52d4f4b67b2d2da8d25afb784 not found: ID does not exist" containerID="117d4ee4da22330cf6cb5ae59194f85f4ad921c52d4f4b67b2d2da8d25afb784" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.675815 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"117d4ee4da22330cf6cb5ae59194f85f4ad921c52d4f4b67b2d2da8d25afb784"} err="failed to get container status \"117d4ee4da22330cf6cb5ae59194f85f4ad921c52d4f4b67b2d2da8d25afb784\": rpc error: code = NotFound desc = could not find container \"117d4ee4da22330cf6cb5ae59194f85f4ad921c52d4f4b67b2d2da8d25afb784\": container with ID starting with 117d4ee4da22330cf6cb5ae59194f85f4ad921c52d4f4b67b2d2da8d25afb784 not found: ID does not exist" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.675826 4964 scope.go:117] "RemoveContainer" containerID="d2fca9dc482f392dbf2fed1cca5375d8faf65e3251036650aa6198ccbc252b3b" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.701595 4964 scope.go:117] "RemoveContainer" 
containerID="8a7afcf85651f3174fcbe431a4f31d4dcc2f0a64c80a6ac95fd5e74c9b0eaab1" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.725576 4964 scope.go:117] "RemoveContainer" containerID="d2fca9dc482f392dbf2fed1cca5375d8faf65e3251036650aa6198ccbc252b3b" Oct 04 02:59:20 crc kubenswrapper[4964]: E1004 02:59:20.726003 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2fca9dc482f392dbf2fed1cca5375d8faf65e3251036650aa6198ccbc252b3b\": container with ID starting with d2fca9dc482f392dbf2fed1cca5375d8faf65e3251036650aa6198ccbc252b3b not found: ID does not exist" containerID="d2fca9dc482f392dbf2fed1cca5375d8faf65e3251036650aa6198ccbc252b3b" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.726030 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fca9dc482f392dbf2fed1cca5375d8faf65e3251036650aa6198ccbc252b3b"} err="failed to get container status \"d2fca9dc482f392dbf2fed1cca5375d8faf65e3251036650aa6198ccbc252b3b\": rpc error: code = NotFound desc = could not find container \"d2fca9dc482f392dbf2fed1cca5375d8faf65e3251036650aa6198ccbc252b3b\": container with ID starting with d2fca9dc482f392dbf2fed1cca5375d8faf65e3251036650aa6198ccbc252b3b not found: ID does not exist" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.726053 4964 scope.go:117] "RemoveContainer" containerID="8a7afcf85651f3174fcbe431a4f31d4dcc2f0a64c80a6ac95fd5e74c9b0eaab1" Oct 04 02:59:20 crc kubenswrapper[4964]: E1004 02:59:20.726341 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a7afcf85651f3174fcbe431a4f31d4dcc2f0a64c80a6ac95fd5e74c9b0eaab1\": container with ID starting with 8a7afcf85651f3174fcbe431a4f31d4dcc2f0a64c80a6ac95fd5e74c9b0eaab1 not found: ID does not exist" containerID="8a7afcf85651f3174fcbe431a4f31d4dcc2f0a64c80a6ac95fd5e74c9b0eaab1" Oct 04 02:59:20 crc 
kubenswrapper[4964]: I1004 02:59:20.726357 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a7afcf85651f3174fcbe431a4f31d4dcc2f0a64c80a6ac95fd5e74c9b0eaab1"} err="failed to get container status \"8a7afcf85651f3174fcbe431a4f31d4dcc2f0a64c80a6ac95fd5e74c9b0eaab1\": rpc error: code = NotFound desc = could not find container \"8a7afcf85651f3174fcbe431a4f31d4dcc2f0a64c80a6ac95fd5e74c9b0eaab1\": container with ID starting with 8a7afcf85651f3174fcbe431a4f31d4dcc2f0a64c80a6ac95fd5e74c9b0eaab1 not found: ID does not exist" Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.959847 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:59:20 crc kubenswrapper[4964]: I1004 02:59:20.996298 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.020250 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.028051 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.035160 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:59:21 crc kubenswrapper[4964]: E1004 02:59:21.035731 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" containerName="nova-metadata-metadata" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.035763 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" containerName="nova-metadata-metadata" Oct 04 02:59:21 crc kubenswrapper[4964]: E1004 02:59:21.035805 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" containerName="nova-metadata-log" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.035817 
4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" containerName="nova-metadata-log" Oct 04 02:59:21 crc kubenswrapper[4964]: E1004 02:59:21.035839 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88896418-ceba-4933-90f2-cca7dcb2d0be" containerName="nova-api-log" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.035851 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="88896418-ceba-4933-90f2-cca7dcb2d0be" containerName="nova-api-log" Oct 04 02:59:21 crc kubenswrapper[4964]: E1004 02:59:21.035874 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88896418-ceba-4933-90f2-cca7dcb2d0be" containerName="nova-api-api" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.035885 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="88896418-ceba-4933-90f2-cca7dcb2d0be" containerName="nova-api-api" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.036144 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="88896418-ceba-4933-90f2-cca7dcb2d0be" containerName="nova-api-api" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.036175 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" containerName="nova-metadata-log" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.036216 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="88896418-ceba-4933-90f2-cca7dcb2d0be" containerName="nova-api-log" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.036236 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" containerName="nova-metadata-metadata" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.037729 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.040332 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.040869 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.055528 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.059715 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.062769 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.063126 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.063482 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.064205 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.070880 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.164413 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcacbb93-ef68-4fdf-a30f-a7cd458809ae-config-data\") pod \"nova-metadata-0\" (UID: \"fcacbb93-ef68-4fdf-a30f-a7cd458809ae\") " pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.164496 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32379b6f-b326-4de2-800d-09cd730119d4-logs\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.164520 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32379b6f-b326-4de2-800d-09cd730119d4-public-tls-certs\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.164545 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32379b6f-b326-4de2-800d-09cd730119d4-config-data\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.164588 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcacbb93-ef68-4fdf-a30f-a7cd458809ae-logs\") pod \"nova-metadata-0\" (UID: \"fcacbb93-ef68-4fdf-a30f-a7cd458809ae\") " pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.164632 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6957\" (UniqueName: \"kubernetes.io/projected/fcacbb93-ef68-4fdf-a30f-a7cd458809ae-kube-api-access-r6957\") pod \"nova-metadata-0\" (UID: \"fcacbb93-ef68-4fdf-a30f-a7cd458809ae\") " pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.164662 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fcacbb93-ef68-4fdf-a30f-a7cd458809ae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fcacbb93-ef68-4fdf-a30f-a7cd458809ae\") " pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.164699 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32379b6f-b326-4de2-800d-09cd730119d4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.164853 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h826j\" (UniqueName: \"kubernetes.io/projected/32379b6f-b326-4de2-800d-09cd730119d4-kube-api-access-h826j\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.164912 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32379b6f-b326-4de2-800d-09cd730119d4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.165018 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcacbb93-ef68-4fdf-a30f-a7cd458809ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fcacbb93-ef68-4fdf-a30f-a7cd458809ae\") " pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.267083 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32379b6f-b326-4de2-800d-09cd730119d4-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.267159 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h826j\" (UniqueName: \"kubernetes.io/projected/32379b6f-b326-4de2-800d-09cd730119d4-kube-api-access-h826j\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.267207 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32379b6f-b326-4de2-800d-09cd730119d4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.267249 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcacbb93-ef68-4fdf-a30f-a7cd458809ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fcacbb93-ef68-4fdf-a30f-a7cd458809ae\") " pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.267292 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcacbb93-ef68-4fdf-a30f-a7cd458809ae-config-data\") pod \"nova-metadata-0\" (UID: \"fcacbb93-ef68-4fdf-a30f-a7cd458809ae\") " pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.267347 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32379b6f-b326-4de2-800d-09cd730119d4-logs\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.268096 4964 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32379b6f-b326-4de2-800d-09cd730119d4-public-tls-certs\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.268130 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32379b6f-b326-4de2-800d-09cd730119d4-config-data\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.267746 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32379b6f-b326-4de2-800d-09cd730119d4-logs\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.268181 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcacbb93-ef68-4fdf-a30f-a7cd458809ae-logs\") pod \"nova-metadata-0\" (UID: \"fcacbb93-ef68-4fdf-a30f-a7cd458809ae\") " pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.268205 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6957\" (UniqueName: \"kubernetes.io/projected/fcacbb93-ef68-4fdf-a30f-a7cd458809ae-kube-api-access-r6957\") pod \"nova-metadata-0\" (UID: \"fcacbb93-ef68-4fdf-a30f-a7cd458809ae\") " pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.268235 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcacbb93-ef68-4fdf-a30f-a7cd458809ae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fcacbb93-ef68-4fdf-a30f-a7cd458809ae\") " pod="openstack/nova-metadata-0" Oct 
04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.268730 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcacbb93-ef68-4fdf-a30f-a7cd458809ae-logs\") pod \"nova-metadata-0\" (UID: \"fcacbb93-ef68-4fdf-a30f-a7cd458809ae\") " pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.272428 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcacbb93-ef68-4fdf-a30f-a7cd458809ae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fcacbb93-ef68-4fdf-a30f-a7cd458809ae\") " pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.274334 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32379b6f-b326-4de2-800d-09cd730119d4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.274347 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32379b6f-b326-4de2-800d-09cd730119d4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.274804 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32379b6f-b326-4de2-800d-09cd730119d4-public-tls-certs\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.274875 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32379b6f-b326-4de2-800d-09cd730119d4-config-data\") pod 
\"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.282239 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcacbb93-ef68-4fdf-a30f-a7cd458809ae-config-data\") pod \"nova-metadata-0\" (UID: \"fcacbb93-ef68-4fdf-a30f-a7cd458809ae\") " pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.287126 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6957\" (UniqueName: \"kubernetes.io/projected/fcacbb93-ef68-4fdf-a30f-a7cd458809ae-kube-api-access-r6957\") pod \"nova-metadata-0\" (UID: \"fcacbb93-ef68-4fdf-a30f-a7cd458809ae\") " pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.287933 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcacbb93-ef68-4fdf-a30f-a7cd458809ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fcacbb93-ef68-4fdf-a30f-a7cd458809ae\") " pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.291721 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h826j\" (UniqueName: \"kubernetes.io/projected/32379b6f-b326-4de2-800d-09cd730119d4-kube-api-access-h826j\") pod \"nova-api-0\" (UID: \"32379b6f-b326-4de2-800d-09cd730119d4\") " pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.421888 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.430411 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.752680 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 02:59:21 crc kubenswrapper[4964]: W1004 02:59:21.760960 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcacbb93_ef68_4fdf_a30f_a7cd458809ae.slice/crio-b0b27dd4cb27385366a6550dd1191f3cf6b33323b409d2f0c1ca179793742767 WatchSource:0}: Error finding container b0b27dd4cb27385366a6550dd1191f3cf6b33323b409d2f0c1ca179793742767: Status 404 returned error can't find the container with id b0b27dd4cb27385366a6550dd1191f3cf6b33323b409d2f0c1ca179793742767 Oct 04 02:59:21 crc kubenswrapper[4964]: I1004 02:59:21.832788 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 02:59:21 crc kubenswrapper[4964]: W1004 02:59:21.837154 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32379b6f_b326_4de2_800d_09cd730119d4.slice/crio-86bde0b2c46daebbeb4d344811199e245dee1417796a4f15c67dcadec7e361ca WatchSource:0}: Error finding container 86bde0b2c46daebbeb4d344811199e245dee1417796a4f15c67dcadec7e361ca: Status 404 returned error can't find the container with id 86bde0b2c46daebbeb4d344811199e245dee1417796a4f15c67dcadec7e361ca Oct 04 02:59:22 crc kubenswrapper[4964]: I1004 02:59:22.658299 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fcacbb93-ef68-4fdf-a30f-a7cd458809ae","Type":"ContainerStarted","Data":"649faf9fa501404c8eb876e21c5a84ef4645ee01e8bf0220c135e4b6ff217053"} Oct 04 02:59:22 crc kubenswrapper[4964]: I1004 02:59:22.658553 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"fcacbb93-ef68-4fdf-a30f-a7cd458809ae","Type":"ContainerStarted","Data":"5fe700b618e7af7e8bf16e412cb03df21f5656a6008cb7f35943594df6712e90"} Oct 04 02:59:22 crc kubenswrapper[4964]: I1004 02:59:22.658563 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fcacbb93-ef68-4fdf-a30f-a7cd458809ae","Type":"ContainerStarted","Data":"b0b27dd4cb27385366a6550dd1191f3cf6b33323b409d2f0c1ca179793742767"} Oct 04 02:59:22 crc kubenswrapper[4964]: I1004 02:59:22.660816 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"32379b6f-b326-4de2-800d-09cd730119d4","Type":"ContainerStarted","Data":"fe14328c4b7a755603730dc639a88922f0c43881ab2175b10ad76fcb0e5a8faf"} Oct 04 02:59:22 crc kubenswrapper[4964]: I1004 02:59:22.660857 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"32379b6f-b326-4de2-800d-09cd730119d4","Type":"ContainerStarted","Data":"ab2e97dd46f867f542ac51da06dc566ad60e33ae4b5515a46f5921426082e67e"} Oct 04 02:59:22 crc kubenswrapper[4964]: I1004 02:59:22.660869 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"32379b6f-b326-4de2-800d-09cd730119d4","Type":"ContainerStarted","Data":"86bde0b2c46daebbeb4d344811199e245dee1417796a4f15c67dcadec7e361ca"} Oct 04 02:59:22 crc kubenswrapper[4964]: I1004 02:59:22.695727 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.695703033 podStartE2EDuration="2.695703033s" podCreationTimestamp="2025-10-04 02:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:59:22.688085729 +0000 UTC m=+1142.585044377" watchObservedRunningTime="2025-10-04 02:59:22.695703033 +0000 UTC m=+1142.592661701" Oct 04 02:59:22 crc kubenswrapper[4964]: I1004 02:59:22.714940 4964 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.714910139 podStartE2EDuration="2.714910139s" podCreationTimestamp="2025-10-04 02:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 02:59:22.707530191 +0000 UTC m=+1142.604488839" watchObservedRunningTime="2025-10-04 02:59:22.714910139 +0000 UTC m=+1142.611868817" Oct 04 02:59:22 crc kubenswrapper[4964]: I1004 02:59:22.857539 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88896418-ceba-4933-90f2-cca7dcb2d0be" path="/var/lib/kubelet/pods/88896418-ceba-4933-90f2-cca7dcb2d0be/volumes" Oct 04 02:59:22 crc kubenswrapper[4964]: I1004 02:59:22.858383 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa" path="/var/lib/kubelet/pods/e215b1fa-007a-4b28-b4f7-36ebbeb2b3aa/volumes" Oct 04 02:59:23 crc kubenswrapper[4964]: I1004 02:59:23.996576 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 04 02:59:26 crc kubenswrapper[4964]: I1004 02:59:26.422577 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 04 02:59:26 crc kubenswrapper[4964]: I1004 02:59:26.423324 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 04 02:59:28 crc kubenswrapper[4964]: I1004 02:59:28.996976 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 04 02:59:29 crc kubenswrapper[4964]: I1004 02:59:29.040817 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 04 02:59:29 crc kubenswrapper[4964]: I1004 02:59:29.803076 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 04 
02:59:30 crc kubenswrapper[4964]: I1004 02:59:30.041139 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 04 02:59:31 crc kubenswrapper[4964]: I1004 02:59:31.422252 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 04 02:59:31 crc kubenswrapper[4964]: I1004 02:59:31.422675 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 04 02:59:31 crc kubenswrapper[4964]: I1004 02:59:31.430787 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 04 02:59:31 crc kubenswrapper[4964]: I1004 02:59:31.430825 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 04 02:59:32 crc kubenswrapper[4964]: I1004 02:59:32.440532 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fcacbb93-ef68-4fdf-a30f-a7cd458809ae" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 02:59:32 crc kubenswrapper[4964]: I1004 02:59:32.455807 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fcacbb93-ef68-4fdf-a30f-a7cd458809ae" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 02:59:32 crc kubenswrapper[4964]: I1004 02:59:32.455863 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="32379b6f-b326-4de2-800d-09cd730119d4" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 02:59:32 crc kubenswrapper[4964]: I1004 
02:59:32.457104 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="32379b6f-b326-4de2-800d-09cd730119d4" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 02:59:34 crc kubenswrapper[4964]: I1004 02:59:34.449338 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 02:59:34 crc kubenswrapper[4964]: I1004 02:59:34.449403 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 02:59:34 crc kubenswrapper[4964]: I1004 02:59:34.449469 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 02:59:34 crc kubenswrapper[4964]: I1004 02:59:34.450130 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be21f7ce532c3d058512254656f890799806a56eaaee57c83d963d0c90820139"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 02:59:34 crc kubenswrapper[4964]: I1004 02:59:34.450186 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" 
containerID="cri-o://be21f7ce532c3d058512254656f890799806a56eaaee57c83d963d0c90820139" gracePeriod=600 Oct 04 02:59:34 crc kubenswrapper[4964]: I1004 02:59:34.813405 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="be21f7ce532c3d058512254656f890799806a56eaaee57c83d963d0c90820139" exitCode=0 Oct 04 02:59:34 crc kubenswrapper[4964]: I1004 02:59:34.813527 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"be21f7ce532c3d058512254656f890799806a56eaaee57c83d963d0c90820139"} Oct 04 02:59:34 crc kubenswrapper[4964]: I1004 02:59:34.813778 4964 scope.go:117] "RemoveContainer" containerID="008a6133f8963f8c25283a4615f3f65b17e14a1929e0bda2e812a4ec5ec09c24" Oct 04 02:59:35 crc kubenswrapper[4964]: I1004 02:59:35.830699 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"8ac79a9f0f6a28341a809172243b68b1f44d642d4092bfec8fae089b115215b9"} Oct 04 02:59:41 crc kubenswrapper[4964]: I1004 02:59:41.428732 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 04 02:59:41 crc kubenswrapper[4964]: I1004 02:59:41.429670 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 04 02:59:41 crc kubenswrapper[4964]: I1004 02:59:41.436347 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 04 02:59:41 crc kubenswrapper[4964]: I1004 02:59:41.441220 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 04 02:59:41 crc kubenswrapper[4964]: I1004 02:59:41.449875 4964 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-api-0" Oct 04 02:59:41 crc kubenswrapper[4964]: I1004 02:59:41.450522 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 04 02:59:41 crc kubenswrapper[4964]: I1004 02:59:41.460715 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 04 02:59:41 crc kubenswrapper[4964]: I1004 02:59:41.462893 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 04 02:59:41 crc kubenswrapper[4964]: I1004 02:59:41.902450 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 04 02:59:41 crc kubenswrapper[4964]: I1004 02:59:41.913170 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 04 02:59:50 crc kubenswrapper[4964]: I1004 02:59:50.375990 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 02:59:51 crc kubenswrapper[4964]: I1004 02:59:51.631072 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 02:59:54 crc kubenswrapper[4964]: I1004 02:59:54.310014 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e87e52ad-66be-448b-b575-6d0acd8a8d4e" containerName="rabbitmq" containerID="cri-o://7eba40b5529e0287412e42c271cdd31aae6d9a13102a9e1a22d19ff95912006c" gracePeriod=604797 Oct 04 02:59:55 crc kubenswrapper[4964]: I1004 02:59:55.450883 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="58ea849f-c48c-473c-8608-694d254c47cf" containerName="rabbitmq" containerID="cri-o://6fe31342d7154218ae274b945e0de41c07dd0cdac895d00f76f419b97968eb9e" gracePeriod=604797 Oct 04 02:59:57 crc kubenswrapper[4964]: I1004 02:59:57.525904 4964 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e87e52ad-66be-448b-b575-6d0acd8a8d4e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Oct 04 02:59:57 crc kubenswrapper[4964]: I1004 02:59:57.807266 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="58ea849f-c48c-473c-8608-694d254c47cf" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.151093 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj"] Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.153659 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.162039 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.162480 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.163497 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj"] Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.265246 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aed3aad0-399d-4d5e-91cd-1fa0c65af611-config-volume\") pod \"collect-profiles-29325780-8nhcj\" (UID: \"aed3aad0-399d-4d5e-91cd-1fa0c65af611\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" Oct 04 03:00:00 crc 
kubenswrapper[4964]: I1004 03:00:00.265532 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhjs9\" (UniqueName: \"kubernetes.io/projected/aed3aad0-399d-4d5e-91cd-1fa0c65af611-kube-api-access-bhjs9\") pod \"collect-profiles-29325780-8nhcj\" (UID: \"aed3aad0-399d-4d5e-91cd-1fa0c65af611\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.265679 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aed3aad0-399d-4d5e-91cd-1fa0c65af611-secret-volume\") pod \"collect-profiles-29325780-8nhcj\" (UID: \"aed3aad0-399d-4d5e-91cd-1fa0c65af611\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.367576 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhjs9\" (UniqueName: \"kubernetes.io/projected/aed3aad0-399d-4d5e-91cd-1fa0c65af611-kube-api-access-bhjs9\") pod \"collect-profiles-29325780-8nhcj\" (UID: \"aed3aad0-399d-4d5e-91cd-1fa0c65af611\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.368037 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aed3aad0-399d-4d5e-91cd-1fa0c65af611-secret-volume\") pod \"collect-profiles-29325780-8nhcj\" (UID: \"aed3aad0-399d-4d5e-91cd-1fa0c65af611\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.368213 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aed3aad0-399d-4d5e-91cd-1fa0c65af611-config-volume\") pod 
\"collect-profiles-29325780-8nhcj\" (UID: \"aed3aad0-399d-4d5e-91cd-1fa0c65af611\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.369755 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aed3aad0-399d-4d5e-91cd-1fa0c65af611-config-volume\") pod \"collect-profiles-29325780-8nhcj\" (UID: \"aed3aad0-399d-4d5e-91cd-1fa0c65af611\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.384351 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aed3aad0-399d-4d5e-91cd-1fa0c65af611-secret-volume\") pod \"collect-profiles-29325780-8nhcj\" (UID: \"aed3aad0-399d-4d5e-91cd-1fa0c65af611\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.399716 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhjs9\" (UniqueName: \"kubernetes.io/projected/aed3aad0-399d-4d5e-91cd-1fa0c65af611-kube-api-access-bhjs9\") pod \"collect-profiles-29325780-8nhcj\" (UID: \"aed3aad0-399d-4d5e-91cd-1fa0c65af611\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.476207 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.886062 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.976863 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-server-conf\") pod \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.976916 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-confd\") pod \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.976941 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-plugins-conf\") pod \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.976976 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e87e52ad-66be-448b-b575-6d0acd8a8d4e-erlang-cookie-secret\") pod \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.977043 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.977068 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/e87e52ad-66be-448b-b575-6d0acd8a8d4e-pod-info\") pod \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.977136 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-erlang-cookie\") pod \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.977185 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwrrf\" (UniqueName: \"kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-kube-api-access-bwrrf\") pod \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.977219 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-config-data\") pod \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.977245 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-plugins\") pod \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.977283 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-tls\") pod \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\" (UID: \"e87e52ad-66be-448b-b575-6d0acd8a8d4e\") " Oct 04 03:00:00 crc 
kubenswrapper[4964]: I1004 03:00:00.979732 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e87e52ad-66be-448b-b575-6d0acd8a8d4e" (UID: "e87e52ad-66be-448b-b575-6d0acd8a8d4e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.980248 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e87e52ad-66be-448b-b575-6d0acd8a8d4e" (UID: "e87e52ad-66be-448b-b575-6d0acd8a8d4e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.981095 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e87e52ad-66be-448b-b575-6d0acd8a8d4e" (UID: "e87e52ad-66be-448b-b575-6d0acd8a8d4e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.983985 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-kube-api-access-bwrrf" (OuterVolumeSpecName: "kube-api-access-bwrrf") pod "e87e52ad-66be-448b-b575-6d0acd8a8d4e" (UID: "e87e52ad-66be-448b-b575-6d0acd8a8d4e"). InnerVolumeSpecName "kube-api-access-bwrrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.984209 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87e52ad-66be-448b-b575-6d0acd8a8d4e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e87e52ad-66be-448b-b575-6d0acd8a8d4e" (UID: "e87e52ad-66be-448b-b575-6d0acd8a8d4e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:00:00 crc kubenswrapper[4964]: I1004 03:00:00.995432 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "e87e52ad-66be-448b-b575-6d0acd8a8d4e" (UID: "e87e52ad-66be-448b-b575-6d0acd8a8d4e"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.001405 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e87e52ad-66be-448b-b575-6d0acd8a8d4e-pod-info" (OuterVolumeSpecName: "pod-info") pod "e87e52ad-66be-448b-b575-6d0acd8a8d4e" (UID: "e87e52ad-66be-448b-b575-6d0acd8a8d4e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.012784 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj"] Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.013891 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e87e52ad-66be-448b-b575-6d0acd8a8d4e" (UID: "e87e52ad-66be-448b-b575-6d0acd8a8d4e"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.040643 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-config-data" (OuterVolumeSpecName: "config-data") pod "e87e52ad-66be-448b-b575-6d0acd8a8d4e" (UID: "e87e52ad-66be-448b-b575-6d0acd8a8d4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.080163 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.080195 4964 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.080205 4964 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.080213 4964 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.080221 4964 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e87e52ad-66be-448b-b575-6d0acd8a8d4e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.080251 4964 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.080261 4964 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e87e52ad-66be-448b-b575-6d0acd8a8d4e-pod-info\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.080270 4964 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.080278 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwrrf\" (UniqueName: \"kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-kube-api-access-bwrrf\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.111031 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-server-conf" (OuterVolumeSpecName: "server-conf") pod "e87e52ad-66be-448b-b575-6d0acd8a8d4e" (UID: "e87e52ad-66be-448b-b575-6d0acd8a8d4e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.121432 4964 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.121531 4964 generic.go:334] "Generic (PLEG): container finished" podID="e87e52ad-66be-448b-b575-6d0acd8a8d4e" containerID="7eba40b5529e0287412e42c271cdd31aae6d9a13102a9e1a22d19ff95912006c" exitCode=0 Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.121604 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e87e52ad-66be-448b-b575-6d0acd8a8d4e","Type":"ContainerDied","Data":"7eba40b5529e0287412e42c271cdd31aae6d9a13102a9e1a22d19ff95912006c"} Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.121658 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e87e52ad-66be-448b-b575-6d0acd8a8d4e","Type":"ContainerDied","Data":"8c7b8f127f3fcdaba27bb59295f0f09db30a21fee9c9b59fc229353188048afc"} Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.121667 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.121679 4964 scope.go:117] "RemoveContainer" containerID="7eba40b5529e0287412e42c271cdd31aae6d9a13102a9e1a22d19ff95912006c" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.126175 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" event={"ID":"aed3aad0-399d-4d5e-91cd-1fa0c65af611","Type":"ContainerStarted","Data":"67fa2f36ee25e68dcafaf7f7bf5f1d411c843ca25bbd9a14ecffc58fc7dc89a2"} Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.148051 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e87e52ad-66be-448b-b575-6d0acd8a8d4e" (UID: "e87e52ad-66be-448b-b575-6d0acd8a8d4e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.157765 4964 scope.go:117] "RemoveContainer" containerID="08db805bf8b88d305b35ca233a5a28aab7ae81c637374c238029a67c96628722" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.179712 4964 scope.go:117] "RemoveContainer" containerID="7eba40b5529e0287412e42c271cdd31aae6d9a13102a9e1a22d19ff95912006c" Oct 04 03:00:01 crc kubenswrapper[4964]: E1004 03:00:01.180066 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eba40b5529e0287412e42c271cdd31aae6d9a13102a9e1a22d19ff95912006c\": container with ID starting with 7eba40b5529e0287412e42c271cdd31aae6d9a13102a9e1a22d19ff95912006c not found: ID does not exist" containerID="7eba40b5529e0287412e42c271cdd31aae6d9a13102a9e1a22d19ff95912006c" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.180105 4964 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7eba40b5529e0287412e42c271cdd31aae6d9a13102a9e1a22d19ff95912006c"} err="failed to get container status \"7eba40b5529e0287412e42c271cdd31aae6d9a13102a9e1a22d19ff95912006c\": rpc error: code = NotFound desc = could not find container \"7eba40b5529e0287412e42c271cdd31aae6d9a13102a9e1a22d19ff95912006c\": container with ID starting with 7eba40b5529e0287412e42c271cdd31aae6d9a13102a9e1a22d19ff95912006c not found: ID does not exist" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.180130 4964 scope.go:117] "RemoveContainer" containerID="08db805bf8b88d305b35ca233a5a28aab7ae81c637374c238029a67c96628722" Oct 04 03:00:01 crc kubenswrapper[4964]: E1004 03:00:01.180456 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08db805bf8b88d305b35ca233a5a28aab7ae81c637374c238029a67c96628722\": container with ID starting with 08db805bf8b88d305b35ca233a5a28aab7ae81c637374c238029a67c96628722 not found: ID does not exist" containerID="08db805bf8b88d305b35ca233a5a28aab7ae81c637374c238029a67c96628722" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.180482 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08db805bf8b88d305b35ca233a5a28aab7ae81c637374c238029a67c96628722"} err="failed to get container status \"08db805bf8b88d305b35ca233a5a28aab7ae81c637374c238029a67c96628722\": rpc error: code = NotFound desc = could not find container \"08db805bf8b88d305b35ca233a5a28aab7ae81c637374c238029a67c96628722\": container with ID starting with 08db805bf8b88d305b35ca233a5a28aab7ae81c637374c238029a67c96628722 not found: ID does not exist" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.181491 4964 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 
03:00:01.181510 4964 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e87e52ad-66be-448b-b575-6d0acd8a8d4e-server-conf\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.181520 4964 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e87e52ad-66be-448b-b575-6d0acd8a8d4e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.512269 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.521289 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.538725 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 03:00:01 crc kubenswrapper[4964]: E1004 03:00:01.539072 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e87e52ad-66be-448b-b575-6d0acd8a8d4e" containerName="setup-container" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.539089 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="e87e52ad-66be-448b-b575-6d0acd8a8d4e" containerName="setup-container" Oct 04 03:00:01 crc kubenswrapper[4964]: E1004 03:00:01.539105 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e87e52ad-66be-448b-b575-6d0acd8a8d4e" containerName="rabbitmq" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.539112 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="e87e52ad-66be-448b-b575-6d0acd8a8d4e" containerName="rabbitmq" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.539282 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="e87e52ad-66be-448b-b575-6d0acd8a8d4e" containerName="rabbitmq" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.540164 4964 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.544288 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.544365 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.544410 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.544525 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.544687 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.544760 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-k58qh" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.544691 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.560713 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.690499 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96c7a0c3-f572-4493-b028-bcbafee4dd24-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.690558 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/96c7a0c3-f572-4493-b028-bcbafee4dd24-pod-info\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.690589 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96c7a0c3-f572-4493-b028-bcbafee4dd24-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.690627 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjwfw\" (UniqueName: \"kubernetes.io/projected/96c7a0c3-f572-4493-b028-bcbafee4dd24-kube-api-access-pjwfw\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.690668 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96c7a0c3-f572-4493-b028-bcbafee4dd24-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.690734 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96c7a0c3-f572-4493-b028-bcbafee4dd24-server-conf\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.690763 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/96c7a0c3-f572-4493-b028-bcbafee4dd24-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.690818 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96c7a0c3-f572-4493-b028-bcbafee4dd24-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.690839 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96c7a0c3-f572-4493-b028-bcbafee4dd24-config-data\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.690887 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96c7a0c3-f572-4493-b028-bcbafee4dd24-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.690909 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.793531 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96c7a0c3-f572-4493-b028-bcbafee4dd24-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.793758 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96c7a0c3-f572-4493-b028-bcbafee4dd24-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.793921 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96c7a0c3-f572-4493-b028-bcbafee4dd24-server-conf\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.793993 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96c7a0c3-f572-4493-b028-bcbafee4dd24-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.794143 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96c7a0c3-f572-4493-b028-bcbafee4dd24-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.794179 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96c7a0c3-f572-4493-b028-bcbafee4dd24-config-data\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.794256 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96c7a0c3-f572-4493-b028-bcbafee4dd24-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.794286 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.794328 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96c7a0c3-f572-4493-b028-bcbafee4dd24-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.794371 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96c7a0c3-f572-4493-b028-bcbafee4dd24-pod-info\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.794409 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96c7a0c3-f572-4493-b028-bcbafee4dd24-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.794437 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjwfw\" (UniqueName: \"kubernetes.io/projected/96c7a0c3-f572-4493-b028-bcbafee4dd24-kube-api-access-pjwfw\") 
pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.795482 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96c7a0c3-f572-4493-b028-bcbafee4dd24-server-conf\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.795525 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96c7a0c3-f572-4493-b028-bcbafee4dd24-config-data\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.796142 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96c7a0c3-f572-4493-b028-bcbafee4dd24-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.796282 4964 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.797950 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96c7a0c3-f572-4493-b028-bcbafee4dd24-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.800064 
4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96c7a0c3-f572-4493-b028-bcbafee4dd24-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.801211 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96c7a0c3-f572-4493-b028-bcbafee4dd24-pod-info\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.803046 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96c7a0c3-f572-4493-b028-bcbafee4dd24-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.815214 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96c7a0c3-f572-4493-b028-bcbafee4dd24-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.815380 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjwfw\" (UniqueName: \"kubernetes.io/projected/96c7a0c3-f572-4493-b028-bcbafee4dd24-kube-api-access-pjwfw\") pod \"rabbitmq-server-0\" (UID: \"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.831292 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: 
\"96c7a0c3-f572-4493-b028-bcbafee4dd24\") " pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.859511 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 04 03:00:01 crc kubenswrapper[4964]: I1004 03:00:01.929863 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.100010 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-766kx\" (UniqueName: \"kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-kube-api-access-766kx\") pod \"58ea849f-c48c-473c-8608-694d254c47cf\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.100346 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58ea849f-c48c-473c-8608-694d254c47cf-erlang-cookie-secret\") pod \"58ea849f-c48c-473c-8608-694d254c47cf\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.100395 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-config-data\") pod \"58ea849f-c48c-473c-8608-694d254c47cf\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.100455 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/58ea849f-c48c-473c-8608-694d254c47cf-pod-info\") pod \"58ea849f-c48c-473c-8608-694d254c47cf\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.100487 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-tls\") pod \"58ea849f-c48c-473c-8608-694d254c47cf\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.100520 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-confd\") pod \"58ea849f-c48c-473c-8608-694d254c47cf\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.100587 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-erlang-cookie\") pod \"58ea849f-c48c-473c-8608-694d254c47cf\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.100636 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-plugins-conf\") pod \"58ea849f-c48c-473c-8608-694d254c47cf\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.100710 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-plugins\") pod \"58ea849f-c48c-473c-8608-694d254c47cf\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.100733 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-server-conf\") pod \"58ea849f-c48c-473c-8608-694d254c47cf\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " Oct 04 
03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.100764 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"58ea849f-c48c-473c-8608-694d254c47cf\" (UID: \"58ea849f-c48c-473c-8608-694d254c47cf\") " Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.101275 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "58ea849f-c48c-473c-8608-694d254c47cf" (UID: "58ea849f-c48c-473c-8608-694d254c47cf"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.103674 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ea849f-c48c-473c-8608-694d254c47cf-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "58ea849f-c48c-473c-8608-694d254c47cf" (UID: "58ea849f-c48c-473c-8608-694d254c47cf"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.103996 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "58ea849f-c48c-473c-8608-694d254c47cf" (UID: "58ea849f-c48c-473c-8608-694d254c47cf"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.104630 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "58ea849f-c48c-473c-8608-694d254c47cf" (UID: "58ea849f-c48c-473c-8608-694d254c47cf"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.105346 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "58ea849f-c48c-473c-8608-694d254c47cf" (UID: "58ea849f-c48c-473c-8608-694d254c47cf"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.106409 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-kube-api-access-766kx" (OuterVolumeSpecName: "kube-api-access-766kx") pod "58ea849f-c48c-473c-8608-694d254c47cf" (UID: "58ea849f-c48c-473c-8608-694d254c47cf"). InnerVolumeSpecName "kube-api-access-766kx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.107070 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "58ea849f-c48c-473c-8608-694d254c47cf" (UID: "58ea849f-c48c-473c-8608-694d254c47cf"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.108757 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/58ea849f-c48c-473c-8608-694d254c47cf-pod-info" (OuterVolumeSpecName: "pod-info") pod "58ea849f-c48c-473c-8608-694d254c47cf" (UID: "58ea849f-c48c-473c-8608-694d254c47cf"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.138382 4964 generic.go:334] "Generic (PLEG): container finished" podID="aed3aad0-399d-4d5e-91cd-1fa0c65af611" containerID="5975933ad7114cb3bc088caf90d6f71c8c9e419f59fd25e77c9713c3cb606ae6" exitCode=0 Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.138437 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" event={"ID":"aed3aad0-399d-4d5e-91cd-1fa0c65af611","Type":"ContainerDied","Data":"5975933ad7114cb3bc088caf90d6f71c8c9e419f59fd25e77c9713c3cb606ae6"} Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.143431 4964 generic.go:334] "Generic (PLEG): container finished" podID="58ea849f-c48c-473c-8608-694d254c47cf" containerID="6fe31342d7154218ae274b945e0de41c07dd0cdac895d00f76f419b97968eb9e" exitCode=0 Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.143477 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"58ea849f-c48c-473c-8608-694d254c47cf","Type":"ContainerDied","Data":"6fe31342d7154218ae274b945e0de41c07dd0cdac895d00f76f419b97968eb9e"} Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.143499 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.143518 4964 scope.go:117] "RemoveContainer" containerID="6fe31342d7154218ae274b945e0de41c07dd0cdac895d00f76f419b97968eb9e" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.143506 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"58ea849f-c48c-473c-8608-694d254c47cf","Type":"ContainerDied","Data":"8b896c1c70823e0cf0b70ac85c19f1df0914f0fd7997c93ec60f1501a4db1da9"} Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.145822 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-config-data" (OuterVolumeSpecName: "config-data") pod "58ea849f-c48c-473c-8608-694d254c47cf" (UID: "58ea849f-c48c-473c-8608-694d254c47cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.172951 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-server-conf" (OuterVolumeSpecName: "server-conf") pod "58ea849f-c48c-473c-8608-694d254c47cf" (UID: "58ea849f-c48c-473c-8608-694d254c47cf"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.182934 4964 scope.go:117] "RemoveContainer" containerID="a520d0e8af5bf9aed07472b58ea6c9fdd00d5af00a084474f9d20db6d7bc2e60" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.202029 4964 scope.go:117] "RemoveContainer" containerID="6fe31342d7154218ae274b945e0de41c07dd0cdac895d00f76f419b97968eb9e" Oct 04 03:00:02 crc kubenswrapper[4964]: E1004 03:00:02.202398 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fe31342d7154218ae274b945e0de41c07dd0cdac895d00f76f419b97968eb9e\": container with ID starting with 6fe31342d7154218ae274b945e0de41c07dd0cdac895d00f76f419b97968eb9e not found: ID does not exist" containerID="6fe31342d7154218ae274b945e0de41c07dd0cdac895d00f76f419b97968eb9e" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.202463 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fe31342d7154218ae274b945e0de41c07dd0cdac895d00f76f419b97968eb9e"} err="failed to get container status \"6fe31342d7154218ae274b945e0de41c07dd0cdac895d00f76f419b97968eb9e\": rpc error: code = NotFound desc = could not find container \"6fe31342d7154218ae274b945e0de41c07dd0cdac895d00f76f419b97968eb9e\": container with ID starting with 6fe31342d7154218ae274b945e0de41c07dd0cdac895d00f76f419b97968eb9e not found: ID does not exist" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.202483 4964 scope.go:117] "RemoveContainer" containerID="a520d0e8af5bf9aed07472b58ea6c9fdd00d5af00a084474f9d20db6d7bc2e60" Oct 04 03:00:02 crc kubenswrapper[4964]: E1004 03:00:02.202854 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a520d0e8af5bf9aed07472b58ea6c9fdd00d5af00a084474f9d20db6d7bc2e60\": container with ID starting with 
a520d0e8af5bf9aed07472b58ea6c9fdd00d5af00a084474f9d20db6d7bc2e60 not found: ID does not exist" containerID="a520d0e8af5bf9aed07472b58ea6c9fdd00d5af00a084474f9d20db6d7bc2e60" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.202884 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a520d0e8af5bf9aed07472b58ea6c9fdd00d5af00a084474f9d20db6d7bc2e60"} err="failed to get container status \"a520d0e8af5bf9aed07472b58ea6c9fdd00d5af00a084474f9d20db6d7bc2e60\": rpc error: code = NotFound desc = could not find container \"a520d0e8af5bf9aed07472b58ea6c9fdd00d5af00a084474f9d20db6d7bc2e60\": container with ID starting with a520d0e8af5bf9aed07472b58ea6c9fdd00d5af00a084474f9d20db6d7bc2e60 not found: ID does not exist" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.203901 4964 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58ea849f-c48c-473c-8608-694d254c47cf-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.203929 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.203938 4964 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/58ea849f-c48c-473c-8608-694d254c47cf-pod-info\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.203948 4964 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.203957 4964 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.203969 4964 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.203977 4964 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.203985 4964 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58ea849f-c48c-473c-8608-694d254c47cf-server-conf\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.204010 4964 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.204056 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-766kx\" (UniqueName: \"kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-kube-api-access-766kx\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.217171 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "58ea849f-c48c-473c-8608-694d254c47cf" (UID: "58ea849f-c48c-473c-8608-694d254c47cf"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.223748 4964 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.307346 4964 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.307383 4964 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58ea849f-c48c-473c-8608-694d254c47cf-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.317789 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 03:00:02 crc kubenswrapper[4964]: W1004 03:00:02.324417 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96c7a0c3_f572_4493_b028_bcbafee4dd24.slice/crio-0817a15405532f84d7231c27a8a4391ea11e3aa3753d825d388d2863c9fc5bc0 WatchSource:0}: Error finding container 0817a15405532f84d7231c27a8a4391ea11e3aa3753d825d388d2863c9fc5bc0: Status 404 returned error can't find the container with id 0817a15405532f84d7231c27a8a4391ea11e3aa3753d825d388d2863c9fc5bc0 Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.548667 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.555353 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.589442 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 03:00:02 crc 
kubenswrapper[4964]: E1004 03:00:02.589812 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ea849f-c48c-473c-8608-694d254c47cf" containerName="setup-container" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.589829 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ea849f-c48c-473c-8608-694d254c47cf" containerName="setup-container" Oct 04 03:00:02 crc kubenswrapper[4964]: E1004 03:00:02.589844 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ea849f-c48c-473c-8608-694d254c47cf" containerName="rabbitmq" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.589851 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ea849f-c48c-473c-8608-694d254c47cf" containerName="rabbitmq" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.589994 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="58ea849f-c48c-473c-8608-694d254c47cf" containerName="rabbitmq" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.590840 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.593845 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.594202 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.594410 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-s5vz9" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.594566 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.594809 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.595769 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.601964 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.602536 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.730111 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.730172 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.730203 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.730231 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4thq\" (UniqueName: \"kubernetes.io/projected/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-kube-api-access-h4thq\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.730323 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.730366 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.730389 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.730416 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.730440 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.730486 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.730509 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.831645 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.831692 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.831713 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.831733 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.831772 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.831805 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.831832 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.831853 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.831874 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.831895 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4thq\" (UniqueName: \"kubernetes.io/projected/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-kube-api-access-h4thq\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.831958 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 
03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.832654 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.832873 4964 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.832951 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.833170 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.833519 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.833671 4964 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.846484 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.846540 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.847163 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.847375 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.857942 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58ea849f-c48c-473c-8608-694d254c47cf" path="/var/lib/kubelet/pods/58ea849f-c48c-473c-8608-694d254c47cf/volumes" Oct 04 03:00:02 crc 
kubenswrapper[4964]: I1004 03:00:02.858876 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e87e52ad-66be-448b-b575-6d0acd8a8d4e" path="/var/lib/kubelet/pods/e87e52ad-66be-448b-b575-6d0acd8a8d4e/volumes" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.868506 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4thq\" (UniqueName: \"kubernetes.io/projected/8a662b31-7b7d-4491-bdc3-0b5c48b52f8c-kube-api-access-h4thq\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.881160 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:02 crc kubenswrapper[4964]: I1004 03:00:02.907892 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:03 crc kubenswrapper[4964]: I1004 03:00:03.156861 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96c7a0c3-f572-4493-b028-bcbafee4dd24","Type":"ContainerStarted","Data":"0817a15405532f84d7231c27a8a4391ea11e3aa3753d825d388d2863c9fc5bc0"} Oct 04 03:00:03 crc kubenswrapper[4964]: I1004 03:00:03.343442 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 03:00:03 crc kubenswrapper[4964]: I1004 03:00:03.671979 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" Oct 04 03:00:03 crc kubenswrapper[4964]: I1004 03:00:03.748063 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhjs9\" (UniqueName: \"kubernetes.io/projected/aed3aad0-399d-4d5e-91cd-1fa0c65af611-kube-api-access-bhjs9\") pod \"aed3aad0-399d-4d5e-91cd-1fa0c65af611\" (UID: \"aed3aad0-399d-4d5e-91cd-1fa0c65af611\") " Oct 04 03:00:03 crc kubenswrapper[4964]: I1004 03:00:03.748100 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aed3aad0-399d-4d5e-91cd-1fa0c65af611-secret-volume\") pod \"aed3aad0-399d-4d5e-91cd-1fa0c65af611\" (UID: \"aed3aad0-399d-4d5e-91cd-1fa0c65af611\") " Oct 04 03:00:03 crc kubenswrapper[4964]: I1004 03:00:03.748169 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aed3aad0-399d-4d5e-91cd-1fa0c65af611-config-volume\") pod \"aed3aad0-399d-4d5e-91cd-1fa0c65af611\" (UID: \"aed3aad0-399d-4d5e-91cd-1fa0c65af611\") " Oct 04 03:00:03 crc kubenswrapper[4964]: I1004 03:00:03.749165 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aed3aad0-399d-4d5e-91cd-1fa0c65af611-config-volume" (OuterVolumeSpecName: "config-volume") pod "aed3aad0-399d-4d5e-91cd-1fa0c65af611" (UID: "aed3aad0-399d-4d5e-91cd-1fa0c65af611"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:03 crc kubenswrapper[4964]: I1004 03:00:03.754542 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed3aad0-399d-4d5e-91cd-1fa0c65af611-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "aed3aad0-399d-4d5e-91cd-1fa0c65af611" (UID: "aed3aad0-399d-4d5e-91cd-1fa0c65af611"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:00:03 crc kubenswrapper[4964]: I1004 03:00:03.755646 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aed3aad0-399d-4d5e-91cd-1fa0c65af611-kube-api-access-bhjs9" (OuterVolumeSpecName: "kube-api-access-bhjs9") pod "aed3aad0-399d-4d5e-91cd-1fa0c65af611" (UID: "aed3aad0-399d-4d5e-91cd-1fa0c65af611"). InnerVolumeSpecName "kube-api-access-bhjs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:00:03 crc kubenswrapper[4964]: I1004 03:00:03.850344 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhjs9\" (UniqueName: \"kubernetes.io/projected/aed3aad0-399d-4d5e-91cd-1fa0c65af611-kube-api-access-bhjs9\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:03 crc kubenswrapper[4964]: I1004 03:00:03.850396 4964 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aed3aad0-399d-4d5e-91cd-1fa0c65af611-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:03 crc kubenswrapper[4964]: I1004 03:00:03.850415 4964 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aed3aad0-399d-4d5e-91cd-1fa0c65af611-config-volume\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:04 crc kubenswrapper[4964]: I1004 03:00:04.187085 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" event={"ID":"aed3aad0-399d-4d5e-91cd-1fa0c65af611","Type":"ContainerDied","Data":"67fa2f36ee25e68dcafaf7f7bf5f1d411c843ca25bbd9a14ecffc58fc7dc89a2"} Oct 04 03:00:04 crc kubenswrapper[4964]: I1004 03:00:04.187474 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67fa2f36ee25e68dcafaf7f7bf5f1d411c843ca25bbd9a14ecffc58fc7dc89a2" Oct 04 03:00:04 crc kubenswrapper[4964]: I1004 03:00:04.187120 4964 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj" Oct 04 03:00:04 crc kubenswrapper[4964]: I1004 03:00:04.190868 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96c7a0c3-f572-4493-b028-bcbafee4dd24","Type":"ContainerStarted","Data":"f74f2a2a25515b5d73d18b38e89db8d9ee2d8078c02302515182a3f0f0c585ee"} Oct 04 03:00:04 crc kubenswrapper[4964]: I1004 03:00:04.194410 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c","Type":"ContainerStarted","Data":"9c9deaf2b0719aa5ccb5e6a93bdc6002fab6c6fce35e8e7fe3d80de95be5bdbc"} Oct 04 03:00:05 crc kubenswrapper[4964]: I1004 03:00:05.215709 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c","Type":"ContainerStarted","Data":"a5456efd6ff5f6c426f02128e5766de5da5da153bd4cd8493a91538f521bdd62"} Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.164367 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-vvj4p"] Oct 04 03:00:06 crc kubenswrapper[4964]: E1004 03:00:06.165059 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed3aad0-399d-4d5e-91cd-1fa0c65af611" containerName="collect-profiles" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.165080 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed3aad0-399d-4d5e-91cd-1fa0c65af611" containerName="collect-profiles" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.165302 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed3aad0-399d-4d5e-91cd-1fa0c65af611" containerName="collect-profiles" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.166342 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.177338 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-vvj4p"] Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.178542 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.194056 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgm8g\" (UniqueName: \"kubernetes.io/projected/350305ec-cb73-4986-8a3b-64893679650f-kube-api-access-qgm8g\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.194124 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.194181 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.194256 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " 
pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.194304 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.194373 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-config\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.299278 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.299398 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.299496 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-config\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " 
pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.299564 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgm8g\" (UniqueName: \"kubernetes.io/projected/350305ec-cb73-4986-8a3b-64893679650f-kube-api-access-qgm8g\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.299717 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.299797 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.301726 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.303023 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 
crc kubenswrapper[4964]: I1004 03:00:06.303776 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.303879 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.304059 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-config\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.323108 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgm8g\" (UniqueName: \"kubernetes.io/projected/350305ec-cb73-4986-8a3b-64893679650f-kube-api-access-qgm8g\") pod \"dnsmasq-dns-6447ccbd8f-vvj4p\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:06 crc kubenswrapper[4964]: I1004 03:00:06.488461 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:07 crc kubenswrapper[4964]: I1004 03:00:07.112872 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-vvj4p"] Oct 04 03:00:07 crc kubenswrapper[4964]: W1004 03:00:07.118406 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod350305ec_cb73_4986_8a3b_64893679650f.slice/crio-619cb8e840e84e36fbcec6498f0ed717a57ea8f0db45ceab38cf4328aa6274a9 WatchSource:0}: Error finding container 619cb8e840e84e36fbcec6498f0ed717a57ea8f0db45ceab38cf4328aa6274a9: Status 404 returned error can't find the container with id 619cb8e840e84e36fbcec6498f0ed717a57ea8f0db45ceab38cf4328aa6274a9 Oct 04 03:00:07 crc kubenswrapper[4964]: I1004 03:00:07.234013 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" event={"ID":"350305ec-cb73-4986-8a3b-64893679650f","Type":"ContainerStarted","Data":"619cb8e840e84e36fbcec6498f0ed717a57ea8f0db45ceab38cf4328aa6274a9"} Oct 04 03:00:08 crc kubenswrapper[4964]: I1004 03:00:08.248961 4964 generic.go:334] "Generic (PLEG): container finished" podID="350305ec-cb73-4986-8a3b-64893679650f" containerID="0b67f5a6a3d44ed288885f56f166ace05c15fbb89b287837411473c050f9c648" exitCode=0 Oct 04 03:00:08 crc kubenswrapper[4964]: I1004 03:00:08.249066 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" event={"ID":"350305ec-cb73-4986-8a3b-64893679650f","Type":"ContainerDied","Data":"0b67f5a6a3d44ed288885f56f166ace05c15fbb89b287837411473c050f9c648"} Oct 04 03:00:09 crc kubenswrapper[4964]: I1004 03:00:09.264405 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" event={"ID":"350305ec-cb73-4986-8a3b-64893679650f","Type":"ContainerStarted","Data":"326bed6a44f1a474d76bcf202866e79068e5f63f3696076fc2d9b564bad6fa24"} Oct 04 03:00:09 crc 
kubenswrapper[4964]: I1004 03:00:09.264862 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:09 crc kubenswrapper[4964]: I1004 03:00:09.301181 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" podStartSLOduration=3.301123423 podStartE2EDuration="3.301123423s" podCreationTimestamp="2025-10-04 03:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 03:00:09.298145693 +0000 UTC m=+1189.195104371" watchObservedRunningTime="2025-10-04 03:00:09.301123423 +0000 UTC m=+1189.198082091" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.489993 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.551906 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-nwzrj"] Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.552483 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" podUID="03b93a62-2bba-402a-9dcb-d6b501af6c4b" containerName="dnsmasq-dns" containerID="cri-o://fe0f863949acc1b1b253b6ba2fcadee3cd2a5d828c83595397ba8a9a4395bef0" gracePeriod=10 Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.707197 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-f97dv"] Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.708954 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.721544 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-f97dv"] Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.814378 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvnzx\" (UniqueName: \"kubernetes.io/projected/2d8985fb-442a-4daa-bf40-9079a2e62aba-kube-api-access-rvnzx\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.814900 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.814967 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.815046 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-config\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.815156 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.815183 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.916829 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnzx\" (UniqueName: \"kubernetes.io/projected/2d8985fb-442a-4daa-bf40-9079a2e62aba-kube-api-access-rvnzx\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.917656 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.917688 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.917719 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-config\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.917790 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.917807 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.921778 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.922529 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.923041 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.923512 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-config\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.924003 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:16 crc kubenswrapper[4964]: I1004 03:00:16.943332 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvnzx\" (UniqueName: \"kubernetes.io/projected/2d8985fb-442a-4daa-bf40-9079a2e62aba-kube-api-access-rvnzx\") pod \"dnsmasq-dns-864d5fc68c-f97dv\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.019201 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.033822 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.121300 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-ovsdbserver-sb\") pod \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.121456 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-ovsdbserver-nb\") pod \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.121505 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4psv\" (UniqueName: \"kubernetes.io/projected/03b93a62-2bba-402a-9dcb-d6b501af6c4b-kube-api-access-d4psv\") pod \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.121602 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-dns-svc\") pod \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.121712 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-config\") pod \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\" (UID: \"03b93a62-2bba-402a-9dcb-d6b501af6c4b\") " Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.128013 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/03b93a62-2bba-402a-9dcb-d6b501af6c4b-kube-api-access-d4psv" (OuterVolumeSpecName: "kube-api-access-d4psv") pod "03b93a62-2bba-402a-9dcb-d6b501af6c4b" (UID: "03b93a62-2bba-402a-9dcb-d6b501af6c4b"). InnerVolumeSpecName "kube-api-access-d4psv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.171251 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "03b93a62-2bba-402a-9dcb-d6b501af6c4b" (UID: "03b93a62-2bba-402a-9dcb-d6b501af6c4b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.174985 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "03b93a62-2bba-402a-9dcb-d6b501af6c4b" (UID: "03b93a62-2bba-402a-9dcb-d6b501af6c4b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.176635 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-config" (OuterVolumeSpecName: "config") pod "03b93a62-2bba-402a-9dcb-d6b501af6c4b" (UID: "03b93a62-2bba-402a-9dcb-d6b501af6c4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.185694 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "03b93a62-2bba-402a-9dcb-d6b501af6c4b" (UID: "03b93a62-2bba-402a-9dcb-d6b501af6c4b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.223454 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.223481 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4psv\" (UniqueName: \"kubernetes.io/projected/03b93a62-2bba-402a-9dcb-d6b501af6c4b-kube-api-access-d4psv\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.223492 4964 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.223500 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-config\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.223508 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03b93a62-2bba-402a-9dcb-d6b501af6c4b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.368849 4964 generic.go:334] "Generic (PLEG): container finished" podID="03b93a62-2bba-402a-9dcb-d6b501af6c4b" containerID="fe0f863949acc1b1b253b6ba2fcadee3cd2a5d828c83595397ba8a9a4395bef0" exitCode=0 Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.368920 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" event={"ID":"03b93a62-2bba-402a-9dcb-d6b501af6c4b","Type":"ContainerDied","Data":"fe0f863949acc1b1b253b6ba2fcadee3cd2a5d828c83595397ba8a9a4395bef0"} Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 
03:00:17.368963 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" event={"ID":"03b93a62-2bba-402a-9dcb-d6b501af6c4b","Type":"ContainerDied","Data":"f9977feb2ae54e71a3fe1757f3a1a693636ef7562207fd919af6f9d232233efa"} Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.368996 4964 scope.go:117] "RemoveContainer" containerID="fe0f863949acc1b1b253b6ba2fcadee3cd2a5d828c83595397ba8a9a4395bef0" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.369063 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-nwzrj" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.405727 4964 scope.go:117] "RemoveContainer" containerID="e8d461c3ba423a9945400a3dfb634fe761490eb93da2e405c8b100208e551f0a" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.424883 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-nwzrj"] Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.431260 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-nwzrj"] Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.445327 4964 scope.go:117] "RemoveContainer" containerID="fe0f863949acc1b1b253b6ba2fcadee3cd2a5d828c83595397ba8a9a4395bef0" Oct 04 03:00:17 crc kubenswrapper[4964]: E1004 03:00:17.445925 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe0f863949acc1b1b253b6ba2fcadee3cd2a5d828c83595397ba8a9a4395bef0\": container with ID starting with fe0f863949acc1b1b253b6ba2fcadee3cd2a5d828c83595397ba8a9a4395bef0 not found: ID does not exist" containerID="fe0f863949acc1b1b253b6ba2fcadee3cd2a5d828c83595397ba8a9a4395bef0" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.445993 4964 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fe0f863949acc1b1b253b6ba2fcadee3cd2a5d828c83595397ba8a9a4395bef0"} err="failed to get container status \"fe0f863949acc1b1b253b6ba2fcadee3cd2a5d828c83595397ba8a9a4395bef0\": rpc error: code = NotFound desc = could not find container \"fe0f863949acc1b1b253b6ba2fcadee3cd2a5d828c83595397ba8a9a4395bef0\": container with ID starting with fe0f863949acc1b1b253b6ba2fcadee3cd2a5d828c83595397ba8a9a4395bef0 not found: ID does not exist" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.446043 4964 scope.go:117] "RemoveContainer" containerID="e8d461c3ba423a9945400a3dfb634fe761490eb93da2e405c8b100208e551f0a" Oct 04 03:00:17 crc kubenswrapper[4964]: E1004 03:00:17.446651 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8d461c3ba423a9945400a3dfb634fe761490eb93da2e405c8b100208e551f0a\": container with ID starting with e8d461c3ba423a9945400a3dfb634fe761490eb93da2e405c8b100208e551f0a not found: ID does not exist" containerID="e8d461c3ba423a9945400a3dfb634fe761490eb93da2e405c8b100208e551f0a" Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.446676 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8d461c3ba423a9945400a3dfb634fe761490eb93da2e405c8b100208e551f0a"} err="failed to get container status \"e8d461c3ba423a9945400a3dfb634fe761490eb93da2e405c8b100208e551f0a\": rpc error: code = NotFound desc = could not find container \"e8d461c3ba423a9945400a3dfb634fe761490eb93da2e405c8b100208e551f0a\": container with ID starting with e8d461c3ba423a9945400a3dfb634fe761490eb93da2e405c8b100208e551f0a not found: ID does not exist" Oct 04 03:00:17 crc kubenswrapper[4964]: W1004 03:00:17.491295 4964 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d8985fb_442a_4daa_bf40_9079a2e62aba.slice/crio-015542d47f2c70243c6113e88c5c32252ef4e64582a93add3479d68d345bca2b WatchSource:0}: Error finding container 015542d47f2c70243c6113e88c5c32252ef4e64582a93add3479d68d345bca2b: Status 404 returned error can't find the container with id 015542d47f2c70243c6113e88c5c32252ef4e64582a93add3479d68d345bca2b Oct 04 03:00:17 crc kubenswrapper[4964]: I1004 03:00:17.492294 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-f97dv"] Oct 04 03:00:18 crc kubenswrapper[4964]: I1004 03:00:18.378408 4964 generic.go:334] "Generic (PLEG): container finished" podID="2d8985fb-442a-4daa-bf40-9079a2e62aba" containerID="98c5e85573c9bbd808758517c70c4022c10d8eb1035478b0e7f58b8bb3eba973" exitCode=0 Oct 04 03:00:18 crc kubenswrapper[4964]: I1004 03:00:18.378528 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" event={"ID":"2d8985fb-442a-4daa-bf40-9079a2e62aba","Type":"ContainerDied","Data":"98c5e85573c9bbd808758517c70c4022c10d8eb1035478b0e7f58b8bb3eba973"} Oct 04 03:00:18 crc kubenswrapper[4964]: I1004 03:00:18.378877 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" event={"ID":"2d8985fb-442a-4daa-bf40-9079a2e62aba","Type":"ContainerStarted","Data":"015542d47f2c70243c6113e88c5c32252ef4e64582a93add3479d68d345bca2b"} Oct 04 03:00:18 crc kubenswrapper[4964]: I1004 03:00:18.871454 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03b93a62-2bba-402a-9dcb-d6b501af6c4b" path="/var/lib/kubelet/pods/03b93a62-2bba-402a-9dcb-d6b501af6c4b/volumes" Oct 04 03:00:19 crc kubenswrapper[4964]: I1004 03:00:19.405114 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" 
event={"ID":"2d8985fb-442a-4daa-bf40-9079a2e62aba","Type":"ContainerStarted","Data":"aa4d9b8c9647c5be10e60a465a36cd7576e892c9ad32c63be67173655fda3138"} Oct 04 03:00:19 crc kubenswrapper[4964]: I1004 03:00:19.405336 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:19 crc kubenswrapper[4964]: I1004 03:00:19.439281 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" podStartSLOduration=3.439254672 podStartE2EDuration="3.439254672s" podCreationTimestamp="2025-10-04 03:00:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 03:00:19.434780612 +0000 UTC m=+1199.331739290" watchObservedRunningTime="2025-10-04 03:00:19.439254672 +0000 UTC m=+1199.336213340" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.315698 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw"] Oct 04 03:00:22 crc kubenswrapper[4964]: E1004 03:00:22.316430 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b93a62-2bba-402a-9dcb-d6b501af6c4b" containerName="dnsmasq-dns" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.316441 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b93a62-2bba-402a-9dcb-d6b501af6c4b" containerName="dnsmasq-dns" Oct 04 03:00:22 crc kubenswrapper[4964]: E1004 03:00:22.316460 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b93a62-2bba-402a-9dcb-d6b501af6c4b" containerName="init" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.316466 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b93a62-2bba-402a-9dcb-d6b501af6c4b" containerName="init" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.316665 4964 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="03b93a62-2bba-402a-9dcb-d6b501af6c4b" containerName="dnsmasq-dns" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.317232 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.321324 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.321573 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.322322 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.322363 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.349688 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw"] Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.443581 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw45r\" (UniqueName: \"kubernetes.io/projected/7ffdf276-5128-4a2e-8ccf-18210ada6acf-kube-api-access-fw45r\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw\" (UID: \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.443633 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw\" (UID: 
\"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.443658 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw\" (UID: \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.443767 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw\" (UID: \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.546051 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw45r\" (UniqueName: \"kubernetes.io/projected/7ffdf276-5128-4a2e-8ccf-18210ada6acf-kube-api-access-fw45r\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw\" (UID: \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.546105 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw\" (UID: \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.546137 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw\" (UID: \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.546270 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw\" (UID: \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.553389 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw\" (UID: \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.553796 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw\" (UID: \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.555031 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw\" (UID: 
\"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.568495 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw45r\" (UniqueName: \"kubernetes.io/projected/7ffdf276-5128-4a2e-8ccf-18210ada6acf-kube-api-access-fw45r\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw\" (UID: \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:22 crc kubenswrapper[4964]: I1004 03:00:22.696776 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:23 crc kubenswrapper[4964]: I1004 03:00:23.267230 4964 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 03:00:23 crc kubenswrapper[4964]: I1004 03:00:23.268536 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw"] Oct 04 03:00:23 crc kubenswrapper[4964]: I1004 03:00:23.452594 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" event={"ID":"7ffdf276-5128-4a2e-8ccf-18210ada6acf","Type":"ContainerStarted","Data":"97dcfa01ec137516cc3c97548807e3606f3af274ad1eac7de246f87e187d2d31"} Oct 04 03:00:25 crc kubenswrapper[4964]: I1004 03:00:25.166252 4964 scope.go:117] "RemoveContainer" containerID="06bcb6f4f12ef71723138814d4c15774e1ffe64b1c8461b644c26b78fc22fb15" Oct 04 03:00:25 crc kubenswrapper[4964]: I1004 03:00:25.199867 4964 scope.go:117] "RemoveContainer" containerID="e63c1ee3e8f285ccadede1dc095151d3e035347bfef096b438a385574d8542ad" Oct 04 03:00:25 crc kubenswrapper[4964]: I1004 03:00:25.231095 4964 scope.go:117] "RemoveContainer" containerID="acd1fb7361d733b7aa02046d2b75696f89842e04659a38d8fbf1adf2d4563855" Oct 04 
03:00:27 crc kubenswrapper[4964]: I1004 03:00:27.035780 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:00:27 crc kubenswrapper[4964]: I1004 03:00:27.117387 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-vvj4p"] Oct 04 03:00:27 crc kubenswrapper[4964]: I1004 03:00:27.118606 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" podUID="350305ec-cb73-4986-8a3b-64893679650f" containerName="dnsmasq-dns" containerID="cri-o://326bed6a44f1a474d76bcf202866e79068e5f63f3696076fc2d9b564bad6fa24" gracePeriod=10 Oct 04 03:00:27 crc kubenswrapper[4964]: I1004 03:00:27.492902 4964 generic.go:334] "Generic (PLEG): container finished" podID="350305ec-cb73-4986-8a3b-64893679650f" containerID="326bed6a44f1a474d76bcf202866e79068e5f63f3696076fc2d9b564bad6fa24" exitCode=0 Oct 04 03:00:27 crc kubenswrapper[4964]: I1004 03:00:27.492946 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" event={"ID":"350305ec-cb73-4986-8a3b-64893679650f","Type":"ContainerDied","Data":"326bed6a44f1a474d76bcf202866e79068e5f63f3696076fc2d9b564bad6fa24"} Oct 04 03:00:31 crc kubenswrapper[4964]: I1004 03:00:31.488863 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" podUID="350305ec-cb73-4986-8a3b-64893679650f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.194:5353: connect: connection refused" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.472028 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.542028 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-ovsdbserver-sb\") pod \"350305ec-cb73-4986-8a3b-64893679650f\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.542160 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-ovsdbserver-nb\") pod \"350305ec-cb73-4986-8a3b-64893679650f\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.542218 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-dns-svc\") pod \"350305ec-cb73-4986-8a3b-64893679650f\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.542254 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-config\") pod \"350305ec-cb73-4986-8a3b-64893679650f\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.542341 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgm8g\" (UniqueName: \"kubernetes.io/projected/350305ec-cb73-4986-8a3b-64893679650f-kube-api-access-qgm8g\") pod \"350305ec-cb73-4986-8a3b-64893679650f\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.542432 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-openstack-edpm-ipam\") pod \"350305ec-cb73-4986-8a3b-64893679650f\" (UID: \"350305ec-cb73-4986-8a3b-64893679650f\") " Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.546400 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" event={"ID":"7ffdf276-5128-4a2e-8ccf-18210ada6acf","Type":"ContainerStarted","Data":"5b99ce90eb7266f89e6c9bf905b48a6c5c4d4f12f1dfe0b37020b5bd30e410b4"} Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.548007 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/350305ec-cb73-4986-8a3b-64893679650f-kube-api-access-qgm8g" (OuterVolumeSpecName: "kube-api-access-qgm8g") pod "350305ec-cb73-4986-8a3b-64893679650f" (UID: "350305ec-cb73-4986-8a3b-64893679650f"). InnerVolumeSpecName "kube-api-access-qgm8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.550045 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" event={"ID":"350305ec-cb73-4986-8a3b-64893679650f","Type":"ContainerDied","Data":"619cb8e840e84e36fbcec6498f0ed717a57ea8f0db45ceab38cf4328aa6274a9"} Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.550085 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-vvj4p" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.550102 4964 scope.go:117] "RemoveContainer" containerID="326bed6a44f1a474d76bcf202866e79068e5f63f3696076fc2d9b564bad6fa24" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.573518 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" podStartSLOduration=1.691809721 podStartE2EDuration="10.573497513s" podCreationTimestamp="2025-10-04 03:00:22 +0000 UTC" firstStartedPulling="2025-10-04 03:00:23.266668972 +0000 UTC m=+1203.163627650" lastFinishedPulling="2025-10-04 03:00:32.148356794 +0000 UTC m=+1212.045315442" observedRunningTime="2025-10-04 03:00:32.565091497 +0000 UTC m=+1212.462050145" watchObservedRunningTime="2025-10-04 03:00:32.573497513 +0000 UTC m=+1212.470456161" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.592007 4964 scope.go:117] "RemoveContainer" containerID="0b67f5a6a3d44ed288885f56f166ace05c15fbb89b287837411473c050f9c648" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.601715 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-config" (OuterVolumeSpecName: "config") pod "350305ec-cb73-4986-8a3b-64893679650f" (UID: "350305ec-cb73-4986-8a3b-64893679650f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.610188 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "350305ec-cb73-4986-8a3b-64893679650f" (UID: "350305ec-cb73-4986-8a3b-64893679650f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.619891 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "350305ec-cb73-4986-8a3b-64893679650f" (UID: "350305ec-cb73-4986-8a3b-64893679650f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.620048 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "350305ec-cb73-4986-8a3b-64893679650f" (UID: "350305ec-cb73-4986-8a3b-64893679650f"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.627787 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "350305ec-cb73-4986-8a3b-64893679650f" (UID: "350305ec-cb73-4986-8a3b-64893679650f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.645295 4964 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.645328 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.645341 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.645349 4964 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.645360 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/350305ec-cb73-4986-8a3b-64893679650f-config\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.645368 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgm8g\" (UniqueName: \"kubernetes.io/projected/350305ec-cb73-4986-8a3b-64893679650f-kube-api-access-qgm8g\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.900002 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-vvj4p"] Oct 04 03:00:32 crc kubenswrapper[4964]: I1004 03:00:32.913101 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-vvj4p"] Oct 04 
03:00:34 crc kubenswrapper[4964]: I1004 03:00:34.863828 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="350305ec-cb73-4986-8a3b-64893679650f" path="/var/lib/kubelet/pods/350305ec-cb73-4986-8a3b-64893679650f/volumes" Oct 04 03:00:37 crc kubenswrapper[4964]: I1004 03:00:37.608148 4964 generic.go:334] "Generic (PLEG): container finished" podID="96c7a0c3-f572-4493-b028-bcbafee4dd24" containerID="f74f2a2a25515b5d73d18b38e89db8d9ee2d8078c02302515182a3f0f0c585ee" exitCode=0 Oct 04 03:00:37 crc kubenswrapper[4964]: I1004 03:00:37.608275 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96c7a0c3-f572-4493-b028-bcbafee4dd24","Type":"ContainerDied","Data":"f74f2a2a25515b5d73d18b38e89db8d9ee2d8078c02302515182a3f0f0c585ee"} Oct 04 03:00:37 crc kubenswrapper[4964]: I1004 03:00:37.610529 4964 generic.go:334] "Generic (PLEG): container finished" podID="8a662b31-7b7d-4491-bdc3-0b5c48b52f8c" containerID="a5456efd6ff5f6c426f02128e5766de5da5da153bd4cd8493a91538f521bdd62" exitCode=0 Oct 04 03:00:37 crc kubenswrapper[4964]: I1004 03:00:37.610571 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c","Type":"ContainerDied","Data":"a5456efd6ff5f6c426f02128e5766de5da5da153bd4cd8493a91538f521bdd62"} Oct 04 03:00:38 crc kubenswrapper[4964]: I1004 03:00:38.619342 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8a662b31-7b7d-4491-bdc3-0b5c48b52f8c","Type":"ContainerStarted","Data":"0f9b0ea23a016e29df9973a76bf128f7c5ea7a0cacb12faf99a6a4488bd09987"} Oct 04 03:00:38 crc kubenswrapper[4964]: I1004 03:00:38.620747 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:00:38 crc kubenswrapper[4964]: I1004 03:00:38.626802 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"96c7a0c3-f572-4493-b028-bcbafee4dd24","Type":"ContainerStarted","Data":"a0362a98e3c2f2dcfdf8d447e29a1759b980fc47f8424a2f3c5d5c04521e8bd0"} Oct 04 03:00:38 crc kubenswrapper[4964]: I1004 03:00:38.627552 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 04 03:00:38 crc kubenswrapper[4964]: I1004 03:00:38.645756 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.645740616 podStartE2EDuration="36.645740616s" podCreationTimestamp="2025-10-04 03:00:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 03:00:38.64437861 +0000 UTC m=+1218.541337248" watchObservedRunningTime="2025-10-04 03:00:38.645740616 +0000 UTC m=+1218.542699254" Oct 04 03:00:38 crc kubenswrapper[4964]: I1004 03:00:38.690770 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.690752165 podStartE2EDuration="37.690752165s" podCreationTimestamp="2025-10-04 03:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 03:00:38.67717745 +0000 UTC m=+1218.574136098" watchObservedRunningTime="2025-10-04 03:00:38.690752165 +0000 UTC m=+1218.587710803" Oct 04 03:00:43 crc kubenswrapper[4964]: I1004 03:00:43.700457 4964 generic.go:334] "Generic (PLEG): container finished" podID="7ffdf276-5128-4a2e-8ccf-18210ada6acf" containerID="5b99ce90eb7266f89e6c9bf905b48a6c5c4d4f12f1dfe0b37020b5bd30e410b4" exitCode=0 Oct 04 03:00:43 crc kubenswrapper[4964]: I1004 03:00:43.700599 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" 
event={"ID":"7ffdf276-5128-4a2e-8ccf-18210ada6acf","Type":"ContainerDied","Data":"5b99ce90eb7266f89e6c9bf905b48a6c5c4d4f12f1dfe0b37020b5bd30e410b4"} Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.126053 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.197099 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-inventory\") pod \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\" (UID: \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.197144 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-repo-setup-combined-ca-bundle\") pod \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\" (UID: \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.197286 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-ssh-key\") pod \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\" (UID: \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.197336 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw45r\" (UniqueName: \"kubernetes.io/projected/7ffdf276-5128-4a2e-8ccf-18210ada6acf-kube-api-access-fw45r\") pod \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\" (UID: \"7ffdf276-5128-4a2e-8ccf-18210ada6acf\") " Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.204765 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7ffdf276-5128-4a2e-8ccf-18210ada6acf" (UID: "7ffdf276-5128-4a2e-8ccf-18210ada6acf"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.221088 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ffdf276-5128-4a2e-8ccf-18210ada6acf-kube-api-access-fw45r" (OuterVolumeSpecName: "kube-api-access-fw45r") pod "7ffdf276-5128-4a2e-8ccf-18210ada6acf" (UID: "7ffdf276-5128-4a2e-8ccf-18210ada6acf"). InnerVolumeSpecName "kube-api-access-fw45r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.231765 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-inventory" (OuterVolumeSpecName: "inventory") pod "7ffdf276-5128-4a2e-8ccf-18210ada6acf" (UID: "7ffdf276-5128-4a2e-8ccf-18210ada6acf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.253110 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7ffdf276-5128-4a2e-8ccf-18210ada6acf" (UID: "7ffdf276-5128-4a2e-8ccf-18210ada6acf"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.299789 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.299829 4964 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.299842 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ffdf276-5128-4a2e-8ccf-18210ada6acf-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.299874 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw45r\" (UniqueName: \"kubernetes.io/projected/7ffdf276-5128-4a2e-8ccf-18210ada6acf-kube-api-access-fw45r\") on node \"crc\" DevicePath \"\"" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.724137 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.724142 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw" event={"ID":"7ffdf276-5128-4a2e-8ccf-18210ada6acf","Type":"ContainerDied","Data":"97dcfa01ec137516cc3c97548807e3606f3af274ad1eac7de246f87e187d2d31"} Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.724628 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97dcfa01ec137516cc3c97548807e3606f3af274ad1eac7de246f87e187d2d31" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.817553 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx"] Oct 04 03:00:45 crc kubenswrapper[4964]: E1004 03:00:45.818011 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="350305ec-cb73-4986-8a3b-64893679650f" containerName="dnsmasq-dns" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.818030 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="350305ec-cb73-4986-8a3b-64893679650f" containerName="dnsmasq-dns" Oct 04 03:00:45 crc kubenswrapper[4964]: E1004 03:00:45.818060 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ffdf276-5128-4a2e-8ccf-18210ada6acf" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.818070 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ffdf276-5128-4a2e-8ccf-18210ada6acf" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 04 03:00:45 crc kubenswrapper[4964]: E1004 03:00:45.818086 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="350305ec-cb73-4986-8a3b-64893679650f" containerName="init" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.818243 4964 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="350305ec-cb73-4986-8a3b-64893679650f" containerName="init" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.818587 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ffdf276-5128-4a2e-8ccf-18210ada6acf" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.818651 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="350305ec-cb73-4986-8a3b-64893679650f" containerName="dnsmasq-dns" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.819748 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.827018 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx"] Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.827252 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.827493 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.829691 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.829952 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.919864 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx\" (UID: 
\"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.920064 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx\" (UID: \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.920354 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvmdq\" (UniqueName: \"kubernetes.io/projected/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-kube-api-access-wvmdq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx\" (UID: \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:00:45 crc kubenswrapper[4964]: I1004 03:00:45.920530 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx\" (UID: \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:00:46 crc kubenswrapper[4964]: I1004 03:00:46.022331 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvmdq\" (UniqueName: \"kubernetes.io/projected/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-kube-api-access-wvmdq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx\" (UID: \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:00:46 crc kubenswrapper[4964]: I1004 03:00:46.022416 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx\" (UID: \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:00:46 crc kubenswrapper[4964]: I1004 03:00:46.022467 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx\" (UID: \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:00:46 crc kubenswrapper[4964]: I1004 03:00:46.022496 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx\" (UID: \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:00:46 crc kubenswrapper[4964]: I1004 03:00:46.027053 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx\" (UID: \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:00:46 crc kubenswrapper[4964]: I1004 03:00:46.029279 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx\" (UID: 
\"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:00:46 crc kubenswrapper[4964]: I1004 03:00:46.032078 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx\" (UID: \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:00:46 crc kubenswrapper[4964]: I1004 03:00:46.042816 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvmdq\" (UniqueName: \"kubernetes.io/projected/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-kube-api-access-wvmdq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx\" (UID: \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:00:46 crc kubenswrapper[4964]: I1004 03:00:46.139400 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:00:46 crc kubenswrapper[4964]: I1004 03:00:46.739304 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx"] Oct 04 03:00:47 crc kubenswrapper[4964]: I1004 03:00:47.757538 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" event={"ID":"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4","Type":"ContainerStarted","Data":"fc93f8897872fbc7302743cac1691820166790a59b5676def40e260ba6fe729f"} Oct 04 03:00:47 crc kubenswrapper[4964]: I1004 03:00:47.758019 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" event={"ID":"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4","Type":"ContainerStarted","Data":"53cefb04b751d4427e944166e07ef7b01b407a1a281ffa0a7b680062be1e35b2"} Oct 04 03:00:51 crc kubenswrapper[4964]: I1004 03:00:51.865157 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 04 03:00:51 crc kubenswrapper[4964]: I1004 03:00:51.911843 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" podStartSLOduration=6.368970913 podStartE2EDuration="6.911821473s" podCreationTimestamp="2025-10-04 03:00:45 +0000 UTC" firstStartedPulling="2025-10-04 03:00:46.743533194 +0000 UTC m=+1226.640491832" lastFinishedPulling="2025-10-04 03:00:47.286383744 +0000 UTC m=+1227.183342392" observedRunningTime="2025-10-04 03:00:47.782112008 +0000 UTC m=+1227.679070656" watchObservedRunningTime="2025-10-04 03:00:51.911821473 +0000 UTC m=+1231.808780121" Oct 04 03:00:52 crc kubenswrapper[4964]: I1004 03:00:52.910784 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 
03:01:00.152441 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29325781-z42x5"] Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.155549 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.177324 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29325781-z42x5"] Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.243548 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-fernet-keys\") pod \"keystone-cron-29325781-z42x5\" (UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.243890 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdldw\" (UniqueName: \"kubernetes.io/projected/4d97553d-eed6-464e-ac05-89e3591b8bb0-kube-api-access-qdldw\") pod \"keystone-cron-29325781-z42x5\" (UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.244043 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-combined-ca-bundle\") pod \"keystone-cron-29325781-z42x5\" (UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.244112 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-config-data\") pod \"keystone-cron-29325781-z42x5\" 
(UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.346637 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-fernet-keys\") pod \"keystone-cron-29325781-z42x5\" (UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.346800 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdldw\" (UniqueName: \"kubernetes.io/projected/4d97553d-eed6-464e-ac05-89e3591b8bb0-kube-api-access-qdldw\") pod \"keystone-cron-29325781-z42x5\" (UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.346890 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-combined-ca-bundle\") pod \"keystone-cron-29325781-z42x5\" (UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.346933 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-config-data\") pod \"keystone-cron-29325781-z42x5\" (UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.353861 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-config-data\") pod \"keystone-cron-29325781-z42x5\" (UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " 
pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.373573 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-fernet-keys\") pod \"keystone-cron-29325781-z42x5\" (UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.374754 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-combined-ca-bundle\") pod \"keystone-cron-29325781-z42x5\" (UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.378518 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdldw\" (UniqueName: \"kubernetes.io/projected/4d97553d-eed6-464e-ac05-89e3591b8bb0-kube-api-access-qdldw\") pod \"keystone-cron-29325781-z42x5\" (UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.493148 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:00 crc kubenswrapper[4964]: I1004 03:01:00.998440 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29325781-z42x5"] Oct 04 03:01:01 crc kubenswrapper[4964]: W1004 03:01:01.000548 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d97553d_eed6_464e_ac05_89e3591b8bb0.slice/crio-6e4483132a7e957f0c0c965012c4089ad9c43ed24d300b12303d5784bcf14ad1 WatchSource:0}: Error finding container 6e4483132a7e957f0c0c965012c4089ad9c43ed24d300b12303d5784bcf14ad1: Status 404 returned error can't find the container with id 6e4483132a7e957f0c0c965012c4089ad9c43ed24d300b12303d5784bcf14ad1 Oct 04 03:01:01 crc kubenswrapper[4964]: I1004 03:01:01.932254 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325781-z42x5" event={"ID":"4d97553d-eed6-464e-ac05-89e3591b8bb0","Type":"ContainerStarted","Data":"cf17fe3eff60f543081cf963b8a435a2470a0df974423fd97d1ab6ec1051da19"} Oct 04 03:01:01 crc kubenswrapper[4964]: I1004 03:01:01.932662 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325781-z42x5" event={"ID":"4d97553d-eed6-464e-ac05-89e3591b8bb0","Type":"ContainerStarted","Data":"6e4483132a7e957f0c0c965012c4089ad9c43ed24d300b12303d5784bcf14ad1"} Oct 04 03:01:01 crc kubenswrapper[4964]: I1004 03:01:01.957760 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29325781-z42x5" podStartSLOduration=1.9577387929999999 podStartE2EDuration="1.957738793s" podCreationTimestamp="2025-10-04 03:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 03:01:01.952811822 +0000 UTC m=+1241.849770490" watchObservedRunningTime="2025-10-04 03:01:01.957738793 +0000 UTC m=+1241.854697451" Oct 04 03:01:03 
crc kubenswrapper[4964]: I1004 03:01:03.959391 4964 generic.go:334] "Generic (PLEG): container finished" podID="4d97553d-eed6-464e-ac05-89e3591b8bb0" containerID="cf17fe3eff60f543081cf963b8a435a2470a0df974423fd97d1ab6ec1051da19" exitCode=0 Oct 04 03:01:03 crc kubenswrapper[4964]: I1004 03:01:03.959504 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325781-z42x5" event={"ID":"4d97553d-eed6-464e-ac05-89e3591b8bb0","Type":"ContainerDied","Data":"cf17fe3eff60f543081cf963b8a435a2470a0df974423fd97d1ab6ec1051da19"} Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.433828 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.551547 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-combined-ca-bundle\") pod \"4d97553d-eed6-464e-ac05-89e3591b8bb0\" (UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.551683 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdldw\" (UniqueName: \"kubernetes.io/projected/4d97553d-eed6-464e-ac05-89e3591b8bb0-kube-api-access-qdldw\") pod \"4d97553d-eed6-464e-ac05-89e3591b8bb0\" (UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.551858 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-config-data\") pod \"4d97553d-eed6-464e-ac05-89e3591b8bb0\" (UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.552061 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-fernet-keys\") pod \"4d97553d-eed6-464e-ac05-89e3591b8bb0\" (UID: \"4d97553d-eed6-464e-ac05-89e3591b8bb0\") " Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.557774 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4d97553d-eed6-464e-ac05-89e3591b8bb0" (UID: "4d97553d-eed6-464e-ac05-89e3591b8bb0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.557996 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d97553d-eed6-464e-ac05-89e3591b8bb0-kube-api-access-qdldw" (OuterVolumeSpecName: "kube-api-access-qdldw") pod "4d97553d-eed6-464e-ac05-89e3591b8bb0" (UID: "4d97553d-eed6-464e-ac05-89e3591b8bb0"). InnerVolumeSpecName "kube-api-access-qdldw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.581441 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d97553d-eed6-464e-ac05-89e3591b8bb0" (UID: "4d97553d-eed6-464e-ac05-89e3591b8bb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.610927 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-config-data" (OuterVolumeSpecName: "config-data") pod "4d97553d-eed6-464e-ac05-89e3591b8bb0" (UID: "4d97553d-eed6-464e-ac05-89e3591b8bb0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.654925 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.654963 4964 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.654974 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d97553d-eed6-464e-ac05-89e3591b8bb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.654990 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdldw\" (UniqueName: \"kubernetes.io/projected/4d97553d-eed6-464e-ac05-89e3591b8bb0-kube-api-access-qdldw\") on node \"crc\" DevicePath \"\"" Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.985272 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325781-z42x5" event={"ID":"4d97553d-eed6-464e-ac05-89e3591b8bb0","Type":"ContainerDied","Data":"6e4483132a7e957f0c0c965012c4089ad9c43ed24d300b12303d5784bcf14ad1"} Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.985653 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e4483132a7e957f0c0c965012c4089ad9c43ed24d300b12303d5784bcf14ad1" Oct 04 03:01:05 crc kubenswrapper[4964]: I1004 03:01:05.985322 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29325781-z42x5" Oct 04 03:01:34 crc kubenswrapper[4964]: I1004 03:01:34.449186 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:01:34 crc kubenswrapper[4964]: I1004 03:01:34.449745 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:02:04 crc kubenswrapper[4964]: I1004 03:02:04.448739 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:02:04 crc kubenswrapper[4964]: I1004 03:02:04.449312 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:02:25 crc kubenswrapper[4964]: I1004 03:02:25.377486 4964 scope.go:117] "RemoveContainer" containerID="4de5a7142f94763e41aa722d6cd5f5f2dc190940355fb1556cf9e5ad259c9b29" Oct 04 03:02:25 crc kubenswrapper[4964]: I1004 03:02:25.424553 4964 scope.go:117] "RemoveContainer" containerID="e046e921c74c4d4ff280cc1bc9017c225b68c61f6c094fd319f351fcb4c8ccee" Oct 04 03:02:34 crc kubenswrapper[4964]: I1004 03:02:34.448757 4964 
patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:02:34 crc kubenswrapper[4964]: I1004 03:02:34.449289 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:02:34 crc kubenswrapper[4964]: I1004 03:02:34.449346 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 03:02:34 crc kubenswrapper[4964]: I1004 03:02:34.450024 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ac79a9f0f6a28341a809172243b68b1f44d642d4092bfec8fae089b115215b9"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 03:02:34 crc kubenswrapper[4964]: I1004 03:02:34.450078 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://8ac79a9f0f6a28341a809172243b68b1f44d642d4092bfec8fae089b115215b9" gracePeriod=600 Oct 04 03:02:34 crc kubenswrapper[4964]: I1004 03:02:34.972904 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="8ac79a9f0f6a28341a809172243b68b1f44d642d4092bfec8fae089b115215b9" exitCode=0 Oct 04 03:02:34 crc 
kubenswrapper[4964]: I1004 03:02:34.973046 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"8ac79a9f0f6a28341a809172243b68b1f44d642d4092bfec8fae089b115215b9"} Oct 04 03:02:34 crc kubenswrapper[4964]: I1004 03:02:34.973341 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16"} Oct 04 03:02:34 crc kubenswrapper[4964]: I1004 03:02:34.973363 4964 scope.go:117] "RemoveContainer" containerID="be21f7ce532c3d058512254656f890799806a56eaaee57c83d963d0c90820139" Oct 04 03:03:25 crc kubenswrapper[4964]: I1004 03:03:25.578400 4964 scope.go:117] "RemoveContainer" containerID="55e60088f904e46be4b03ea661defb4b742408eb226fc01bc979281f208fb107" Oct 04 03:03:25 crc kubenswrapper[4964]: I1004 03:03:25.610324 4964 scope.go:117] "RemoveContainer" containerID="00aa6bd9d7a2c0272b6fe574f963789ae848ddbd2238276dc716ac6519baa521" Oct 04 03:03:25 crc kubenswrapper[4964]: I1004 03:03:25.642562 4964 scope.go:117] "RemoveContainer" containerID="081a9a611599be9376db7c437a86112cc79f8eafe0f55048de6ae34d8e5f3f0f" Oct 04 03:03:25 crc kubenswrapper[4964]: I1004 03:03:25.708090 4964 scope.go:117] "RemoveContainer" containerID="4e914fdd841b439ed4574800421d4923941a5c357d0ebcaad3d08e9e4a9e3059" Oct 04 03:03:25 crc kubenswrapper[4964]: I1004 03:03:25.749465 4964 scope.go:117] "RemoveContainer" containerID="3f6ad1590cdafd238253a80d892b42c7c494258e10f46bb4176ecb5113422e78" Oct 04 03:03:50 crc kubenswrapper[4964]: I1004 03:03:50.840761 4964 generic.go:334] "Generic (PLEG): container finished" podID="01f3f08c-cbee-4b5a-9f06-60c9bf2978c4" containerID="fc93f8897872fbc7302743cac1691820166790a59b5676def40e260ba6fe729f" exitCode=0 Oct 04 03:03:50 
crc kubenswrapper[4964]: I1004 03:03:50.840932 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" event={"ID":"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4","Type":"ContainerDied","Data":"fc93f8897872fbc7302743cac1691820166790a59b5676def40e260ba6fe729f"} Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.303378 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.421506 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-ssh-key\") pod \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\" (UID: \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.421639 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-bootstrap-combined-ca-bundle\") pod \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\" (UID: \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.421689 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvmdq\" (UniqueName: \"kubernetes.io/projected/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-kube-api-access-wvmdq\") pod \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\" (UID: \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.421740 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-inventory\") pod \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\" (UID: \"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4\") " Oct 04 03:03:52 crc kubenswrapper[4964]: 
I1004 03:03:52.428327 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "01f3f08c-cbee-4b5a-9f06-60c9bf2978c4" (UID: "01f3f08c-cbee-4b5a-9f06-60c9bf2978c4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.439577 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-kube-api-access-wvmdq" (OuterVolumeSpecName: "kube-api-access-wvmdq") pod "01f3f08c-cbee-4b5a-9f06-60c9bf2978c4" (UID: "01f3f08c-cbee-4b5a-9f06-60c9bf2978c4"). InnerVolumeSpecName "kube-api-access-wvmdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.468199 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-inventory" (OuterVolumeSpecName: "inventory") pod "01f3f08c-cbee-4b5a-9f06-60c9bf2978c4" (UID: "01f3f08c-cbee-4b5a-9f06-60c9bf2978c4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.475258 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "01f3f08c-cbee-4b5a-9f06-60c9bf2978c4" (UID: "01f3f08c-cbee-4b5a-9f06-60c9bf2978c4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.523487 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvmdq\" (UniqueName: \"kubernetes.io/projected/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-kube-api-access-wvmdq\") on node \"crc\" DevicePath \"\"" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.523535 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.523555 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.523574 4964 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.866958 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" event={"ID":"01f3f08c-cbee-4b5a-9f06-60c9bf2978c4","Type":"ContainerDied","Data":"53cefb04b751d4427e944166e07ef7b01b407a1a281ffa0a7b680062be1e35b2"} Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.867070 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53cefb04b751d4427e944166e07ef7b01b407a1a281ffa0a7b680062be1e35b2" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.866990 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.974447 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh"] Oct 04 03:03:52 crc kubenswrapper[4964]: E1004 03:03:52.975180 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d97553d-eed6-464e-ac05-89e3591b8bb0" containerName="keystone-cron" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.975269 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d97553d-eed6-464e-ac05-89e3591b8bb0" containerName="keystone-cron" Oct 04 03:03:52 crc kubenswrapper[4964]: E1004 03:03:52.975388 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f3f08c-cbee-4b5a-9f06-60c9bf2978c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.975462 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f3f08c-cbee-4b5a-9f06-60c9bf2978c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.975765 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d97553d-eed6-464e-ac05-89e3591b8bb0" containerName="keystone-cron" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.975859 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f3f08c-cbee-4b5a-9f06-60c9bf2978c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.976652 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.980254 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.980508 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.980650 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.982107 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:03:52 crc kubenswrapper[4964]: I1004 03:03:52.989569 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh"] Oct 04 03:03:53 crc kubenswrapper[4964]: I1004 03:03:53.138246 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2dba0c3e-e5d7-4b39-a2c6-403a93020983-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh\" (UID: \"2dba0c3e-e5d7-4b39-a2c6-403a93020983\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" Oct 04 03:03:53 crc kubenswrapper[4964]: I1004 03:03:53.138410 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdhn8\" (UniqueName: \"kubernetes.io/projected/2dba0c3e-e5d7-4b39-a2c6-403a93020983-kube-api-access-rdhn8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh\" (UID: \"2dba0c3e-e5d7-4b39-a2c6-403a93020983\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" Oct 04 03:03:53 crc kubenswrapper[4964]: 
I1004 03:03:53.138489 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2dba0c3e-e5d7-4b39-a2c6-403a93020983-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh\" (UID: \"2dba0c3e-e5d7-4b39-a2c6-403a93020983\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" Oct 04 03:03:53 crc kubenswrapper[4964]: I1004 03:03:53.239733 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2dba0c3e-e5d7-4b39-a2c6-403a93020983-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh\" (UID: \"2dba0c3e-e5d7-4b39-a2c6-403a93020983\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" Oct 04 03:03:53 crc kubenswrapper[4964]: I1004 03:03:53.239870 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdhn8\" (UniqueName: \"kubernetes.io/projected/2dba0c3e-e5d7-4b39-a2c6-403a93020983-kube-api-access-rdhn8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh\" (UID: \"2dba0c3e-e5d7-4b39-a2c6-403a93020983\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" Oct 04 03:03:53 crc kubenswrapper[4964]: I1004 03:03:53.239922 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2dba0c3e-e5d7-4b39-a2c6-403a93020983-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh\" (UID: \"2dba0c3e-e5d7-4b39-a2c6-403a93020983\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" Oct 04 03:03:53 crc kubenswrapper[4964]: I1004 03:03:53.245454 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2dba0c3e-e5d7-4b39-a2c6-403a93020983-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh\" (UID: \"2dba0c3e-e5d7-4b39-a2c6-403a93020983\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" Oct 04 03:03:53 crc kubenswrapper[4964]: I1004 03:03:53.247259 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2dba0c3e-e5d7-4b39-a2c6-403a93020983-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh\" (UID: \"2dba0c3e-e5d7-4b39-a2c6-403a93020983\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" Oct 04 03:03:53 crc kubenswrapper[4964]: I1004 03:03:53.261778 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdhn8\" (UniqueName: \"kubernetes.io/projected/2dba0c3e-e5d7-4b39-a2c6-403a93020983-kube-api-access-rdhn8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh\" (UID: \"2dba0c3e-e5d7-4b39-a2c6-403a93020983\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" Oct 04 03:03:53 crc kubenswrapper[4964]: I1004 03:03:53.307907 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" Oct 04 03:03:53 crc kubenswrapper[4964]: I1004 03:03:53.869674 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh"] Oct 04 03:03:54 crc kubenswrapper[4964]: I1004 03:03:54.893300 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" event={"ID":"2dba0c3e-e5d7-4b39-a2c6-403a93020983","Type":"ContainerStarted","Data":"2f464a09fcb93ec8c7bf0719420802609ee2b7a6c1c66d8d15f0deccabd86bd6"} Oct 04 03:03:54 crc kubenswrapper[4964]: I1004 03:03:54.893681 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" event={"ID":"2dba0c3e-e5d7-4b39-a2c6-403a93020983","Type":"ContainerStarted","Data":"6c7f79a60eb7d207b2c2f70fdd526202490b6404ef60a0de9f7de340152d3c42"} Oct 04 03:03:54 crc kubenswrapper[4964]: I1004 03:03:54.912734 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" podStartSLOduration=2.448865763 podStartE2EDuration="2.912717852s" podCreationTimestamp="2025-10-04 03:03:52 +0000 UTC" firstStartedPulling="2025-10-04 03:03:53.877562866 +0000 UTC m=+1413.774521504" lastFinishedPulling="2025-10-04 03:03:54.341414925 +0000 UTC m=+1414.238373593" observedRunningTime="2025-10-04 03:03:54.910837672 +0000 UTC m=+1414.807796320" watchObservedRunningTime="2025-10-04 03:03:54.912717852 +0000 UTC m=+1414.809676490" Oct 04 03:04:16 crc kubenswrapper[4964]: I1004 03:04:16.128360 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9j97c"] Oct 04 03:04:16 crc kubenswrapper[4964]: I1004 03:04:16.133219 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:16 crc kubenswrapper[4964]: I1004 03:04:16.142579 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9j97c"] Oct 04 03:04:16 crc kubenswrapper[4964]: I1004 03:04:16.326549 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-utilities\") pod \"redhat-marketplace-9j97c\" (UID: \"bc63adc8-c60c-4868-8555-bc1c9b8de0e3\") " pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:16 crc kubenswrapper[4964]: I1004 03:04:16.326718 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj5sr\" (UniqueName: \"kubernetes.io/projected/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-kube-api-access-qj5sr\") pod \"redhat-marketplace-9j97c\" (UID: \"bc63adc8-c60c-4868-8555-bc1c9b8de0e3\") " pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:16 crc kubenswrapper[4964]: I1004 03:04:16.326805 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-catalog-content\") pod \"redhat-marketplace-9j97c\" (UID: \"bc63adc8-c60c-4868-8555-bc1c9b8de0e3\") " pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:16 crc kubenswrapper[4964]: I1004 03:04:16.428985 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj5sr\" (UniqueName: \"kubernetes.io/projected/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-kube-api-access-qj5sr\") pod \"redhat-marketplace-9j97c\" (UID: \"bc63adc8-c60c-4868-8555-bc1c9b8de0e3\") " pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:16 crc kubenswrapper[4964]: I1004 03:04:16.429093 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-catalog-content\") pod \"redhat-marketplace-9j97c\" (UID: \"bc63adc8-c60c-4868-8555-bc1c9b8de0e3\") " pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:16 crc kubenswrapper[4964]: I1004 03:04:16.429214 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-utilities\") pod \"redhat-marketplace-9j97c\" (UID: \"bc63adc8-c60c-4868-8555-bc1c9b8de0e3\") " pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:16 crc kubenswrapper[4964]: I1004 03:04:16.429720 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-catalog-content\") pod \"redhat-marketplace-9j97c\" (UID: \"bc63adc8-c60c-4868-8555-bc1c9b8de0e3\") " pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:16 crc kubenswrapper[4964]: I1004 03:04:16.429767 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-utilities\") pod \"redhat-marketplace-9j97c\" (UID: \"bc63adc8-c60c-4868-8555-bc1c9b8de0e3\") " pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:16 crc kubenswrapper[4964]: I1004 03:04:16.452494 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj5sr\" (UniqueName: \"kubernetes.io/projected/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-kube-api-access-qj5sr\") pod \"redhat-marketplace-9j97c\" (UID: \"bc63adc8-c60c-4868-8555-bc1c9b8de0e3\") " pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:16 crc kubenswrapper[4964]: I1004 03:04:16.463637 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:16 crc kubenswrapper[4964]: I1004 03:04:16.778690 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9j97c"] Oct 04 03:04:17 crc kubenswrapper[4964]: I1004 03:04:17.138217 4964 generic.go:334] "Generic (PLEG): container finished" podID="bc63adc8-c60c-4868-8555-bc1c9b8de0e3" containerID="01634f34efb3ca2ead0aea8a15cffd64decfb8e4b127f0865bedc63056f35c46" exitCode=0 Oct 04 03:04:17 crc kubenswrapper[4964]: I1004 03:04:17.138293 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j97c" event={"ID":"bc63adc8-c60c-4868-8555-bc1c9b8de0e3","Type":"ContainerDied","Data":"01634f34efb3ca2ead0aea8a15cffd64decfb8e4b127f0865bedc63056f35c46"} Oct 04 03:04:17 crc kubenswrapper[4964]: I1004 03:04:17.138541 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j97c" event={"ID":"bc63adc8-c60c-4868-8555-bc1c9b8de0e3","Type":"ContainerStarted","Data":"de2ad5af220f4b6dc59ffd6490e365ec44b10f4a37665764752cbb332d630a14"} Oct 04 03:04:18 crc kubenswrapper[4964]: I1004 03:04:18.148967 4964 generic.go:334] "Generic (PLEG): container finished" podID="bc63adc8-c60c-4868-8555-bc1c9b8de0e3" containerID="036a98cfa681aa2b0b6657dc8e2adac4d741d33217d911ed6dd8309944caa786" exitCode=0 Oct 04 03:04:18 crc kubenswrapper[4964]: I1004 03:04:18.149273 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j97c" event={"ID":"bc63adc8-c60c-4868-8555-bc1c9b8de0e3","Type":"ContainerDied","Data":"036a98cfa681aa2b0b6657dc8e2adac4d741d33217d911ed6dd8309944caa786"} Oct 04 03:04:19 crc kubenswrapper[4964]: I1004 03:04:19.163599 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j97c" 
event={"ID":"bc63adc8-c60c-4868-8555-bc1c9b8de0e3","Type":"ContainerStarted","Data":"87599bc535fef82bdc7a07deefe57e642042eabe1894b7222ac5258c48dfef77"} Oct 04 03:04:19 crc kubenswrapper[4964]: I1004 03:04:19.185657 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9j97c" podStartSLOduration=1.686763993 podStartE2EDuration="3.185638663s" podCreationTimestamp="2025-10-04 03:04:16 +0000 UTC" firstStartedPulling="2025-10-04 03:04:17.140314387 +0000 UTC m=+1437.037273065" lastFinishedPulling="2025-10-04 03:04:18.639189097 +0000 UTC m=+1438.536147735" observedRunningTime="2025-10-04 03:04:19.180695262 +0000 UTC m=+1439.077653910" watchObservedRunningTime="2025-10-04 03:04:19.185638663 +0000 UTC m=+1439.082597301" Oct 04 03:04:25 crc kubenswrapper[4964]: I1004 03:04:25.860874 4964 scope.go:117] "RemoveContainer" containerID="8f0d133e400581fc931972f43c7d0764d9d640fac6970ab9e6efc9740254826f" Oct 04 03:04:25 crc kubenswrapper[4964]: I1004 03:04:25.891882 4964 scope.go:117] "RemoveContainer" containerID="e79530df5d637bd704985cf9dc5575877412ba4095642b70520b7dfe200658e5" Oct 04 03:04:25 crc kubenswrapper[4964]: I1004 03:04:25.920326 4964 scope.go:117] "RemoveContainer" containerID="17f26b5f3585a9f8e60d48a8e9abbfdb3c67917c3b42841e8251bed80d7c5380" Oct 04 03:04:26 crc kubenswrapper[4964]: I1004 03:04:26.464341 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:26 crc kubenswrapper[4964]: I1004 03:04:26.465925 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:26 crc kubenswrapper[4964]: I1004 03:04:26.549283 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:27 crc kubenswrapper[4964]: I1004 03:04:27.327773 4964 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:27 crc kubenswrapper[4964]: I1004 03:04:27.384601 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9j97c"] Oct 04 03:04:29 crc kubenswrapper[4964]: I1004 03:04:29.306204 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9j97c" podUID="bc63adc8-c60c-4868-8555-bc1c9b8de0e3" containerName="registry-server" containerID="cri-o://87599bc535fef82bdc7a07deefe57e642042eabe1894b7222ac5258c48dfef77" gracePeriod=2 Oct 04 03:04:29 crc kubenswrapper[4964]: I1004 03:04:29.749546 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:29 crc kubenswrapper[4964]: I1004 03:04:29.817715 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-utilities\") pod \"bc63adc8-c60c-4868-8555-bc1c9b8de0e3\" (UID: \"bc63adc8-c60c-4868-8555-bc1c9b8de0e3\") " Oct 04 03:04:29 crc kubenswrapper[4964]: I1004 03:04:29.817773 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-catalog-content\") pod \"bc63adc8-c60c-4868-8555-bc1c9b8de0e3\" (UID: \"bc63adc8-c60c-4868-8555-bc1c9b8de0e3\") " Oct 04 03:04:29 crc kubenswrapper[4964]: I1004 03:04:29.817886 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj5sr\" (UniqueName: \"kubernetes.io/projected/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-kube-api-access-qj5sr\") pod \"bc63adc8-c60c-4868-8555-bc1c9b8de0e3\" (UID: \"bc63adc8-c60c-4868-8555-bc1c9b8de0e3\") " Oct 04 03:04:29 crc kubenswrapper[4964]: I1004 03:04:29.819514 4964 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-utilities" (OuterVolumeSpecName: "utilities") pod "bc63adc8-c60c-4868-8555-bc1c9b8de0e3" (UID: "bc63adc8-c60c-4868-8555-bc1c9b8de0e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:04:29 crc kubenswrapper[4964]: I1004 03:04:29.824145 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-kube-api-access-qj5sr" (OuterVolumeSpecName: "kube-api-access-qj5sr") pod "bc63adc8-c60c-4868-8555-bc1c9b8de0e3" (UID: "bc63adc8-c60c-4868-8555-bc1c9b8de0e3"). InnerVolumeSpecName "kube-api-access-qj5sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:04:29 crc kubenswrapper[4964]: I1004 03:04:29.836731 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc63adc8-c60c-4868-8555-bc1c9b8de0e3" (UID: "bc63adc8-c60c-4868-8555-bc1c9b8de0e3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:04:29 crc kubenswrapper[4964]: I1004 03:04:29.919945 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:04:29 crc kubenswrapper[4964]: I1004 03:04:29.919979 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:04:29 crc kubenswrapper[4964]: I1004 03:04:29.919994 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj5sr\" (UniqueName: \"kubernetes.io/projected/bc63adc8-c60c-4868-8555-bc1c9b8de0e3-kube-api-access-qj5sr\") on node \"crc\" DevicePath \"\"" Oct 04 03:04:30 crc kubenswrapper[4964]: I1004 03:04:30.340181 4964 generic.go:334] "Generic (PLEG): container finished" podID="bc63adc8-c60c-4868-8555-bc1c9b8de0e3" containerID="87599bc535fef82bdc7a07deefe57e642042eabe1894b7222ac5258c48dfef77" exitCode=0 Oct 04 03:04:30 crc kubenswrapper[4964]: I1004 03:04:30.340248 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j97c" event={"ID":"bc63adc8-c60c-4868-8555-bc1c9b8de0e3","Type":"ContainerDied","Data":"87599bc535fef82bdc7a07deefe57e642042eabe1894b7222ac5258c48dfef77"} Oct 04 03:04:30 crc kubenswrapper[4964]: I1004 03:04:30.340289 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j97c" event={"ID":"bc63adc8-c60c-4868-8555-bc1c9b8de0e3","Type":"ContainerDied","Data":"de2ad5af220f4b6dc59ffd6490e365ec44b10f4a37665764752cbb332d630a14"} Oct 04 03:04:30 crc kubenswrapper[4964]: I1004 03:04:30.340320 4964 scope.go:117] "RemoveContainer" containerID="87599bc535fef82bdc7a07deefe57e642042eabe1894b7222ac5258c48dfef77" Oct 04 03:04:30 crc kubenswrapper[4964]: I1004 
03:04:30.340504 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9j97c" Oct 04 03:04:30 crc kubenswrapper[4964]: I1004 03:04:30.372608 4964 scope.go:117] "RemoveContainer" containerID="036a98cfa681aa2b0b6657dc8e2adac4d741d33217d911ed6dd8309944caa786" Oct 04 03:04:30 crc kubenswrapper[4964]: I1004 03:04:30.412543 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9j97c"] Oct 04 03:04:30 crc kubenswrapper[4964]: I1004 03:04:30.426416 4964 scope.go:117] "RemoveContainer" containerID="01634f34efb3ca2ead0aea8a15cffd64decfb8e4b127f0865bedc63056f35c46" Oct 04 03:04:30 crc kubenswrapper[4964]: I1004 03:04:30.429436 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9j97c"] Oct 04 03:04:30 crc kubenswrapper[4964]: I1004 03:04:30.450153 4964 scope.go:117] "RemoveContainer" containerID="87599bc535fef82bdc7a07deefe57e642042eabe1894b7222ac5258c48dfef77" Oct 04 03:04:30 crc kubenswrapper[4964]: E1004 03:04:30.450705 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87599bc535fef82bdc7a07deefe57e642042eabe1894b7222ac5258c48dfef77\": container with ID starting with 87599bc535fef82bdc7a07deefe57e642042eabe1894b7222ac5258c48dfef77 not found: ID does not exist" containerID="87599bc535fef82bdc7a07deefe57e642042eabe1894b7222ac5258c48dfef77" Oct 04 03:04:30 crc kubenswrapper[4964]: I1004 03:04:30.450741 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87599bc535fef82bdc7a07deefe57e642042eabe1894b7222ac5258c48dfef77"} err="failed to get container status \"87599bc535fef82bdc7a07deefe57e642042eabe1894b7222ac5258c48dfef77\": rpc error: code = NotFound desc = could not find container \"87599bc535fef82bdc7a07deefe57e642042eabe1894b7222ac5258c48dfef77\": container with ID starting with 
87599bc535fef82bdc7a07deefe57e642042eabe1894b7222ac5258c48dfef77 not found: ID does not exist" Oct 04 03:04:30 crc kubenswrapper[4964]: I1004 03:04:30.450767 4964 scope.go:117] "RemoveContainer" containerID="036a98cfa681aa2b0b6657dc8e2adac4d741d33217d911ed6dd8309944caa786" Oct 04 03:04:30 crc kubenswrapper[4964]: E1004 03:04:30.451377 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"036a98cfa681aa2b0b6657dc8e2adac4d741d33217d911ed6dd8309944caa786\": container with ID starting with 036a98cfa681aa2b0b6657dc8e2adac4d741d33217d911ed6dd8309944caa786 not found: ID does not exist" containerID="036a98cfa681aa2b0b6657dc8e2adac4d741d33217d911ed6dd8309944caa786" Oct 04 03:04:30 crc kubenswrapper[4964]: I1004 03:04:30.451464 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"036a98cfa681aa2b0b6657dc8e2adac4d741d33217d911ed6dd8309944caa786"} err="failed to get container status \"036a98cfa681aa2b0b6657dc8e2adac4d741d33217d911ed6dd8309944caa786\": rpc error: code = NotFound desc = could not find container \"036a98cfa681aa2b0b6657dc8e2adac4d741d33217d911ed6dd8309944caa786\": container with ID starting with 036a98cfa681aa2b0b6657dc8e2adac4d741d33217d911ed6dd8309944caa786 not found: ID does not exist" Oct 04 03:04:30 crc kubenswrapper[4964]: I1004 03:04:30.451519 4964 scope.go:117] "RemoveContainer" containerID="01634f34efb3ca2ead0aea8a15cffd64decfb8e4b127f0865bedc63056f35c46" Oct 04 03:04:30 crc kubenswrapper[4964]: E1004 03:04:30.451997 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01634f34efb3ca2ead0aea8a15cffd64decfb8e4b127f0865bedc63056f35c46\": container with ID starting with 01634f34efb3ca2ead0aea8a15cffd64decfb8e4b127f0865bedc63056f35c46 not found: ID does not exist" containerID="01634f34efb3ca2ead0aea8a15cffd64decfb8e4b127f0865bedc63056f35c46" Oct 04 03:04:30 crc 
kubenswrapper[4964]: I1004 03:04:30.452028 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01634f34efb3ca2ead0aea8a15cffd64decfb8e4b127f0865bedc63056f35c46"} err="failed to get container status \"01634f34efb3ca2ead0aea8a15cffd64decfb8e4b127f0865bedc63056f35c46\": rpc error: code = NotFound desc = could not find container \"01634f34efb3ca2ead0aea8a15cffd64decfb8e4b127f0865bedc63056f35c46\": container with ID starting with 01634f34efb3ca2ead0aea8a15cffd64decfb8e4b127f0865bedc63056f35c46 not found: ID does not exist" Oct 04 03:04:30 crc kubenswrapper[4964]: I1004 03:04:30.855923 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc63adc8-c60c-4868-8555-bc1c9b8de0e3" path="/var/lib/kubelet/pods/bc63adc8-c60c-4868-8555-bc1c9b8de0e3/volumes" Oct 04 03:04:33 crc kubenswrapper[4964]: I1004 03:04:33.969204 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r8k7g"] Oct 04 03:04:33 crc kubenswrapper[4964]: E1004 03:04:33.970835 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc63adc8-c60c-4868-8555-bc1c9b8de0e3" containerName="registry-server" Oct 04 03:04:33 crc kubenswrapper[4964]: I1004 03:04:33.970873 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc63adc8-c60c-4868-8555-bc1c9b8de0e3" containerName="registry-server" Oct 04 03:04:33 crc kubenswrapper[4964]: E1004 03:04:33.970933 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc63adc8-c60c-4868-8555-bc1c9b8de0e3" containerName="extract-utilities" Oct 04 03:04:33 crc kubenswrapper[4964]: I1004 03:04:33.970953 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc63adc8-c60c-4868-8555-bc1c9b8de0e3" containerName="extract-utilities" Oct 04 03:04:33 crc kubenswrapper[4964]: E1004 03:04:33.971010 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc63adc8-c60c-4868-8555-bc1c9b8de0e3" containerName="extract-content" Oct 
04 03:04:33 crc kubenswrapper[4964]: I1004 03:04:33.971028 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc63adc8-c60c-4868-8555-bc1c9b8de0e3" containerName="extract-content" Oct 04 03:04:33 crc kubenswrapper[4964]: I1004 03:04:33.971485 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc63adc8-c60c-4868-8555-bc1c9b8de0e3" containerName="registry-server" Oct 04 03:04:33 crc kubenswrapper[4964]: I1004 03:04:33.974341 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:33 crc kubenswrapper[4964]: I1004 03:04:33.983393 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8k7g"] Oct 04 03:04:34 crc kubenswrapper[4964]: I1004 03:04:34.014300 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npwt5\" (UniqueName: \"kubernetes.io/projected/be32f6cf-d966-409d-ae31-eac3e6279d31-kube-api-access-npwt5\") pod \"community-operators-r8k7g\" (UID: \"be32f6cf-d966-409d-ae31-eac3e6279d31\") " pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:34 crc kubenswrapper[4964]: I1004 03:04:34.014454 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be32f6cf-d966-409d-ae31-eac3e6279d31-catalog-content\") pod \"community-operators-r8k7g\" (UID: \"be32f6cf-d966-409d-ae31-eac3e6279d31\") " pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:34 crc kubenswrapper[4964]: I1004 03:04:34.014850 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be32f6cf-d966-409d-ae31-eac3e6279d31-utilities\") pod \"community-operators-r8k7g\" (UID: \"be32f6cf-d966-409d-ae31-eac3e6279d31\") " 
pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:34 crc kubenswrapper[4964]: I1004 03:04:34.117084 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be32f6cf-d966-409d-ae31-eac3e6279d31-utilities\") pod \"community-operators-r8k7g\" (UID: \"be32f6cf-d966-409d-ae31-eac3e6279d31\") " pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:34 crc kubenswrapper[4964]: I1004 03:04:34.117249 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npwt5\" (UniqueName: \"kubernetes.io/projected/be32f6cf-d966-409d-ae31-eac3e6279d31-kube-api-access-npwt5\") pod \"community-operators-r8k7g\" (UID: \"be32f6cf-d966-409d-ae31-eac3e6279d31\") " pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:34 crc kubenswrapper[4964]: I1004 03:04:34.117388 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be32f6cf-d966-409d-ae31-eac3e6279d31-catalog-content\") pod \"community-operators-r8k7g\" (UID: \"be32f6cf-d966-409d-ae31-eac3e6279d31\") " pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:34 crc kubenswrapper[4964]: I1004 03:04:34.118062 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be32f6cf-d966-409d-ae31-eac3e6279d31-utilities\") pod \"community-operators-r8k7g\" (UID: \"be32f6cf-d966-409d-ae31-eac3e6279d31\") " pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:34 crc kubenswrapper[4964]: I1004 03:04:34.118202 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be32f6cf-d966-409d-ae31-eac3e6279d31-catalog-content\") pod \"community-operators-r8k7g\" (UID: \"be32f6cf-d966-409d-ae31-eac3e6279d31\") " 
pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:34 crc kubenswrapper[4964]: I1004 03:04:34.150785 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npwt5\" (UniqueName: \"kubernetes.io/projected/be32f6cf-d966-409d-ae31-eac3e6279d31-kube-api-access-npwt5\") pod \"community-operators-r8k7g\" (UID: \"be32f6cf-d966-409d-ae31-eac3e6279d31\") " pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:34 crc kubenswrapper[4964]: I1004 03:04:34.320063 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:34 crc kubenswrapper[4964]: I1004 03:04:34.449422 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:04:34 crc kubenswrapper[4964]: I1004 03:04:34.449499 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:04:34 crc kubenswrapper[4964]: I1004 03:04:34.651261 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8k7g"] Oct 04 03:04:35 crc kubenswrapper[4964]: I1004 03:04:35.405849 4964 generic.go:334] "Generic (PLEG): container finished" podID="be32f6cf-d966-409d-ae31-eac3e6279d31" containerID="558607671b96cb982fa9cd66993d6fb4cbed468b380974148e60d04693e48bff" exitCode=0 Oct 04 03:04:35 crc kubenswrapper[4964]: I1004 03:04:35.405981 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8k7g" 
event={"ID":"be32f6cf-d966-409d-ae31-eac3e6279d31","Type":"ContainerDied","Data":"558607671b96cb982fa9cd66993d6fb4cbed468b380974148e60d04693e48bff"} Oct 04 03:04:35 crc kubenswrapper[4964]: I1004 03:04:35.406398 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8k7g" event={"ID":"be32f6cf-d966-409d-ae31-eac3e6279d31","Type":"ContainerStarted","Data":"c7d2b69b133b31454615f9a5978eb7c0cbf0d795c089d00c6e373f43d661dffb"} Oct 04 03:04:36 crc kubenswrapper[4964]: I1004 03:04:36.418003 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8k7g" event={"ID":"be32f6cf-d966-409d-ae31-eac3e6279d31","Type":"ContainerStarted","Data":"fd0f2e719152d6e783c8f4a9cbf14259b627cba1fcc608814b79379eda95fd9f"} Oct 04 03:04:37 crc kubenswrapper[4964]: I1004 03:04:37.432756 4964 generic.go:334] "Generic (PLEG): container finished" podID="be32f6cf-d966-409d-ae31-eac3e6279d31" containerID="fd0f2e719152d6e783c8f4a9cbf14259b627cba1fcc608814b79379eda95fd9f" exitCode=0 Oct 04 03:04:37 crc kubenswrapper[4964]: I1004 03:04:37.432823 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8k7g" event={"ID":"be32f6cf-d966-409d-ae31-eac3e6279d31","Type":"ContainerDied","Data":"fd0f2e719152d6e783c8f4a9cbf14259b627cba1fcc608814b79379eda95fd9f"} Oct 04 03:04:38 crc kubenswrapper[4964]: I1004 03:04:38.483334 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8k7g" event={"ID":"be32f6cf-d966-409d-ae31-eac3e6279d31","Type":"ContainerStarted","Data":"eb34f8b4e5ebe59a8085a9922cc78bad2cfb60ab0212df57da9b5e6e2d9bd79a"} Oct 04 03:04:38 crc kubenswrapper[4964]: I1004 03:04:38.509286 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r8k7g" podStartSLOduration=3.040217124 podStartE2EDuration="5.509265072s" podCreationTimestamp="2025-10-04 03:04:33 
+0000 UTC" firstStartedPulling="2025-10-04 03:04:35.40848241 +0000 UTC m=+1455.305441058" lastFinishedPulling="2025-10-04 03:04:37.877530328 +0000 UTC m=+1457.774489006" observedRunningTime="2025-10-04 03:04:38.50539069 +0000 UTC m=+1458.402349328" watchObservedRunningTime="2025-10-04 03:04:38.509265072 +0000 UTC m=+1458.406223750" Oct 04 03:04:44 crc kubenswrapper[4964]: I1004 03:04:44.320750 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:44 crc kubenswrapper[4964]: I1004 03:04:44.321180 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:44 crc kubenswrapper[4964]: I1004 03:04:44.400395 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:44 crc kubenswrapper[4964]: I1004 03:04:44.634864 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:44 crc kubenswrapper[4964]: I1004 03:04:44.720985 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r8k7g"] Oct 04 03:04:46 crc kubenswrapper[4964]: I1004 03:04:46.573667 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r8k7g" podUID="be32f6cf-d966-409d-ae31-eac3e6279d31" containerName="registry-server" containerID="cri-o://eb34f8b4e5ebe59a8085a9922cc78bad2cfb60ab0212df57da9b5e6e2d9bd79a" gracePeriod=2 Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.072567 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.208092 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be32f6cf-d966-409d-ae31-eac3e6279d31-catalog-content\") pod \"be32f6cf-d966-409d-ae31-eac3e6279d31\" (UID: \"be32f6cf-d966-409d-ae31-eac3e6279d31\") " Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.208249 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be32f6cf-d966-409d-ae31-eac3e6279d31-utilities\") pod \"be32f6cf-d966-409d-ae31-eac3e6279d31\" (UID: \"be32f6cf-d966-409d-ae31-eac3e6279d31\") " Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.208304 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npwt5\" (UniqueName: \"kubernetes.io/projected/be32f6cf-d966-409d-ae31-eac3e6279d31-kube-api-access-npwt5\") pod \"be32f6cf-d966-409d-ae31-eac3e6279d31\" (UID: \"be32f6cf-d966-409d-ae31-eac3e6279d31\") " Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.209582 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be32f6cf-d966-409d-ae31-eac3e6279d31-utilities" (OuterVolumeSpecName: "utilities") pod "be32f6cf-d966-409d-ae31-eac3e6279d31" (UID: "be32f6cf-d966-409d-ae31-eac3e6279d31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.216039 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be32f6cf-d966-409d-ae31-eac3e6279d31-kube-api-access-npwt5" (OuterVolumeSpecName: "kube-api-access-npwt5") pod "be32f6cf-d966-409d-ae31-eac3e6279d31" (UID: "be32f6cf-d966-409d-ae31-eac3e6279d31"). InnerVolumeSpecName "kube-api-access-npwt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.259725 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be32f6cf-d966-409d-ae31-eac3e6279d31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be32f6cf-d966-409d-ae31-eac3e6279d31" (UID: "be32f6cf-d966-409d-ae31-eac3e6279d31"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.310603 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be32f6cf-d966-409d-ae31-eac3e6279d31-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.310666 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be32f6cf-d966-409d-ae31-eac3e6279d31-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.310685 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npwt5\" (UniqueName: \"kubernetes.io/projected/be32f6cf-d966-409d-ae31-eac3e6279d31-kube-api-access-npwt5\") on node \"crc\" DevicePath \"\"" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.587847 4964 generic.go:334] "Generic (PLEG): container finished" podID="be32f6cf-d966-409d-ae31-eac3e6279d31" containerID="eb34f8b4e5ebe59a8085a9922cc78bad2cfb60ab0212df57da9b5e6e2d9bd79a" exitCode=0 Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.587913 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8k7g" event={"ID":"be32f6cf-d966-409d-ae31-eac3e6279d31","Type":"ContainerDied","Data":"eb34f8b4e5ebe59a8085a9922cc78bad2cfb60ab0212df57da9b5e6e2d9bd79a"} Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.587948 4964 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-r8k7g" event={"ID":"be32f6cf-d966-409d-ae31-eac3e6279d31","Type":"ContainerDied","Data":"c7d2b69b133b31454615f9a5978eb7c0cbf0d795c089d00c6e373f43d661dffb"} Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.587974 4964 scope.go:117] "RemoveContainer" containerID="eb34f8b4e5ebe59a8085a9922cc78bad2cfb60ab0212df57da9b5e6e2d9bd79a" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.588035 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r8k7g" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.615341 4964 scope.go:117] "RemoveContainer" containerID="fd0f2e719152d6e783c8f4a9cbf14259b627cba1fcc608814b79379eda95fd9f" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.645345 4964 scope.go:117] "RemoveContainer" containerID="558607671b96cb982fa9cd66993d6fb4cbed468b380974148e60d04693e48bff" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.714837 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r8k7g"] Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.730358 4964 scope.go:117] "RemoveContainer" containerID="eb34f8b4e5ebe59a8085a9922cc78bad2cfb60ab0212df57da9b5e6e2d9bd79a" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.732012 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r8k7g"] Oct 04 03:04:47 crc kubenswrapper[4964]: E1004 03:04:47.732990 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb34f8b4e5ebe59a8085a9922cc78bad2cfb60ab0212df57da9b5e6e2d9bd79a\": container with ID starting with eb34f8b4e5ebe59a8085a9922cc78bad2cfb60ab0212df57da9b5e6e2d9bd79a not found: ID does not exist" containerID="eb34f8b4e5ebe59a8085a9922cc78bad2cfb60ab0212df57da9b5e6e2d9bd79a" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 
03:04:47.733044 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb34f8b4e5ebe59a8085a9922cc78bad2cfb60ab0212df57da9b5e6e2d9bd79a"} err="failed to get container status \"eb34f8b4e5ebe59a8085a9922cc78bad2cfb60ab0212df57da9b5e6e2d9bd79a\": rpc error: code = NotFound desc = could not find container \"eb34f8b4e5ebe59a8085a9922cc78bad2cfb60ab0212df57da9b5e6e2d9bd79a\": container with ID starting with eb34f8b4e5ebe59a8085a9922cc78bad2cfb60ab0212df57da9b5e6e2d9bd79a not found: ID does not exist" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.733071 4964 scope.go:117] "RemoveContainer" containerID="fd0f2e719152d6e783c8f4a9cbf14259b627cba1fcc608814b79379eda95fd9f" Oct 04 03:04:47 crc kubenswrapper[4964]: E1004 03:04:47.733562 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd0f2e719152d6e783c8f4a9cbf14259b627cba1fcc608814b79379eda95fd9f\": container with ID starting with fd0f2e719152d6e783c8f4a9cbf14259b627cba1fcc608814b79379eda95fd9f not found: ID does not exist" containerID="fd0f2e719152d6e783c8f4a9cbf14259b627cba1fcc608814b79379eda95fd9f" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.733608 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd0f2e719152d6e783c8f4a9cbf14259b627cba1fcc608814b79379eda95fd9f"} err="failed to get container status \"fd0f2e719152d6e783c8f4a9cbf14259b627cba1fcc608814b79379eda95fd9f\": rpc error: code = NotFound desc = could not find container \"fd0f2e719152d6e783c8f4a9cbf14259b627cba1fcc608814b79379eda95fd9f\": container with ID starting with fd0f2e719152d6e783c8f4a9cbf14259b627cba1fcc608814b79379eda95fd9f not found: ID does not exist" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.733665 4964 scope.go:117] "RemoveContainer" containerID="558607671b96cb982fa9cd66993d6fb4cbed468b380974148e60d04693e48bff" Oct 04 03:04:47 crc 
kubenswrapper[4964]: E1004 03:04:47.733937 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"558607671b96cb982fa9cd66993d6fb4cbed468b380974148e60d04693e48bff\": container with ID starting with 558607671b96cb982fa9cd66993d6fb4cbed468b380974148e60d04693e48bff not found: ID does not exist" containerID="558607671b96cb982fa9cd66993d6fb4cbed468b380974148e60d04693e48bff" Oct 04 03:04:47 crc kubenswrapper[4964]: I1004 03:04:47.733966 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"558607671b96cb982fa9cd66993d6fb4cbed468b380974148e60d04693e48bff"} err="failed to get container status \"558607671b96cb982fa9cd66993d6fb4cbed468b380974148e60d04693e48bff\": rpc error: code = NotFound desc = could not find container \"558607671b96cb982fa9cd66993d6fb4cbed468b380974148e60d04693e48bff\": container with ID starting with 558607671b96cb982fa9cd66993d6fb4cbed468b380974148e60d04693e48bff not found: ID does not exist" Oct 04 03:04:48 crc kubenswrapper[4964]: I1004 03:04:48.855915 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be32f6cf-d966-409d-ae31-eac3e6279d31" path="/var/lib/kubelet/pods/be32f6cf-d966-409d-ae31-eac3e6279d31/volumes" Oct 04 03:05:04 crc kubenswrapper[4964]: I1004 03:05:04.449385 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:05:04 crc kubenswrapper[4964]: I1004 03:05:04.450234 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 04 03:05:05 crc kubenswrapper[4964]: I1004 03:05:05.798860 4964 generic.go:334] "Generic (PLEG): container finished" podID="2dba0c3e-e5d7-4b39-a2c6-403a93020983" containerID="2f464a09fcb93ec8c7bf0719420802609ee2b7a6c1c66d8d15f0deccabd86bd6" exitCode=0 Oct 04 03:05:05 crc kubenswrapper[4964]: I1004 03:05:05.798914 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" event={"ID":"2dba0c3e-e5d7-4b39-a2c6-403a93020983","Type":"ContainerDied","Data":"2f464a09fcb93ec8c7bf0719420802609ee2b7a6c1c66d8d15f0deccabd86bd6"} Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.371027 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.446816 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2dba0c3e-e5d7-4b39-a2c6-403a93020983-inventory\") pod \"2dba0c3e-e5d7-4b39-a2c6-403a93020983\" (UID: \"2dba0c3e-e5d7-4b39-a2c6-403a93020983\") " Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.447077 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdhn8\" (UniqueName: \"kubernetes.io/projected/2dba0c3e-e5d7-4b39-a2c6-403a93020983-kube-api-access-rdhn8\") pod \"2dba0c3e-e5d7-4b39-a2c6-403a93020983\" (UID: \"2dba0c3e-e5d7-4b39-a2c6-403a93020983\") " Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.447116 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2dba0c3e-e5d7-4b39-a2c6-403a93020983-ssh-key\") pod \"2dba0c3e-e5d7-4b39-a2c6-403a93020983\" (UID: \"2dba0c3e-e5d7-4b39-a2c6-403a93020983\") " Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.453548 4964 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dba0c3e-e5d7-4b39-a2c6-403a93020983-kube-api-access-rdhn8" (OuterVolumeSpecName: "kube-api-access-rdhn8") pod "2dba0c3e-e5d7-4b39-a2c6-403a93020983" (UID: "2dba0c3e-e5d7-4b39-a2c6-403a93020983"). InnerVolumeSpecName "kube-api-access-rdhn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.483086 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dba0c3e-e5d7-4b39-a2c6-403a93020983-inventory" (OuterVolumeSpecName: "inventory") pod "2dba0c3e-e5d7-4b39-a2c6-403a93020983" (UID: "2dba0c3e-e5d7-4b39-a2c6-403a93020983"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.493038 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dba0c3e-e5d7-4b39-a2c6-403a93020983-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2dba0c3e-e5d7-4b39-a2c6-403a93020983" (UID: "2dba0c3e-e5d7-4b39-a2c6-403a93020983"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.548702 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdhn8\" (UniqueName: \"kubernetes.io/projected/2dba0c3e-e5d7-4b39-a2c6-403a93020983-kube-api-access-rdhn8\") on node \"crc\" DevicePath \"\"" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.548736 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2dba0c3e-e5d7-4b39-a2c6-403a93020983-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.548746 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2dba0c3e-e5d7-4b39-a2c6-403a93020983-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.820173 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" event={"ID":"2dba0c3e-e5d7-4b39-a2c6-403a93020983","Type":"ContainerDied","Data":"6c7f79a60eb7d207b2c2f70fdd526202490b6404ef60a0de9f7de340152d3c42"} Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.820221 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c7f79a60eb7d207b2c2f70fdd526202490b6404ef60a0de9f7de340152d3c42" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.820295 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.916946 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh"] Oct 04 03:05:07 crc kubenswrapper[4964]: E1004 03:05:07.917531 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dba0c3e-e5d7-4b39-a2c6-403a93020983" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.917560 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dba0c3e-e5d7-4b39-a2c6-403a93020983" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 04 03:05:07 crc kubenswrapper[4964]: E1004 03:05:07.917598 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be32f6cf-d966-409d-ae31-eac3e6279d31" containerName="registry-server" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.917652 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="be32f6cf-d966-409d-ae31-eac3e6279d31" containerName="registry-server" Oct 04 03:05:07 crc kubenswrapper[4964]: E1004 03:05:07.917673 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be32f6cf-d966-409d-ae31-eac3e6279d31" containerName="extract-utilities" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.917687 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="be32f6cf-d966-409d-ae31-eac3e6279d31" containerName="extract-utilities" Oct 04 03:05:07 crc kubenswrapper[4964]: E1004 03:05:07.917720 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be32f6cf-d966-409d-ae31-eac3e6279d31" containerName="extract-content" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.917733 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="be32f6cf-d966-409d-ae31-eac3e6279d31" containerName="extract-content" Oct 04 03:05:07 crc 
kubenswrapper[4964]: I1004 03:05:07.918094 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dba0c3e-e5d7-4b39-a2c6-403a93020983" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.918151 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="be32f6cf-d966-409d-ae31-eac3e6279d31" containerName="registry-server" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.919184 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.926426 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.926467 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.926585 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.926898 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.927538 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh"] Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.955668 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3b6c804-8b89-4692-b6a2-54f606a700a8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh\" (UID: \"a3b6c804-8b89-4692-b6a2-54f606a700a8\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.955933 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc4x5\" (UniqueName: \"kubernetes.io/projected/a3b6c804-8b89-4692-b6a2-54f606a700a8-kube-api-access-bc4x5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh\" (UID: \"a3b6c804-8b89-4692-b6a2-54f606a700a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" Oct 04 03:05:07 crc kubenswrapper[4964]: I1004 03:05:07.956088 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3b6c804-8b89-4692-b6a2-54f606a700a8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh\" (UID: \"a3b6c804-8b89-4692-b6a2-54f606a700a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" Oct 04 03:05:08 crc kubenswrapper[4964]: I1004 03:05:08.058854 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3b6c804-8b89-4692-b6a2-54f606a700a8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh\" (UID: \"a3b6c804-8b89-4692-b6a2-54f606a700a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" Oct 04 03:05:08 crc kubenswrapper[4964]: I1004 03:05:08.059606 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc4x5\" (UniqueName: \"kubernetes.io/projected/a3b6c804-8b89-4692-b6a2-54f606a700a8-kube-api-access-bc4x5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh\" (UID: \"a3b6c804-8b89-4692-b6a2-54f606a700a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" Oct 04 03:05:08 crc kubenswrapper[4964]: I1004 03:05:08.059713 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3b6c804-8b89-4692-b6a2-54f606a700a8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh\" (UID: \"a3b6c804-8b89-4692-b6a2-54f606a700a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" Oct 04 03:05:08 crc kubenswrapper[4964]: I1004 03:05:08.064758 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3b6c804-8b89-4692-b6a2-54f606a700a8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh\" (UID: \"a3b6c804-8b89-4692-b6a2-54f606a700a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" Oct 04 03:05:08 crc kubenswrapper[4964]: I1004 03:05:08.065094 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3b6c804-8b89-4692-b6a2-54f606a700a8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh\" (UID: \"a3b6c804-8b89-4692-b6a2-54f606a700a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" Oct 04 03:05:08 crc kubenswrapper[4964]: I1004 03:05:08.088638 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc4x5\" (UniqueName: \"kubernetes.io/projected/a3b6c804-8b89-4692-b6a2-54f606a700a8-kube-api-access-bc4x5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh\" (UID: \"a3b6c804-8b89-4692-b6a2-54f606a700a8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" Oct 04 03:05:08 crc kubenswrapper[4964]: I1004 03:05:08.239355 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" Oct 04 03:05:08 crc kubenswrapper[4964]: I1004 03:05:08.608914 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh"] Oct 04 03:05:08 crc kubenswrapper[4964]: I1004 03:05:08.838605 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" event={"ID":"a3b6c804-8b89-4692-b6a2-54f606a700a8","Type":"ContainerStarted","Data":"ad4f2464a8470fab88c84cf8b32a6486613aa397e4e0566fcfe606958cb274d8"} Oct 04 03:05:09 crc kubenswrapper[4964]: I1004 03:05:09.856034 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" event={"ID":"a3b6c804-8b89-4692-b6a2-54f606a700a8","Type":"ContainerStarted","Data":"1321d6376d84230cda55e9a6943cc1f92e5653e51833ca64edb440b72491ee27"} Oct 04 03:05:09 crc kubenswrapper[4964]: I1004 03:05:09.883525 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" podStartSLOduration=2.348774038 podStartE2EDuration="2.883503272s" podCreationTimestamp="2025-10-04 03:05:07 +0000 UTC" firstStartedPulling="2025-10-04 03:05:08.623781394 +0000 UTC m=+1488.520740042" lastFinishedPulling="2025-10-04 03:05:09.158510628 +0000 UTC m=+1489.055469276" observedRunningTime="2025-10-04 03:05:09.876971643 +0000 UTC m=+1489.773930321" watchObservedRunningTime="2025-10-04 03:05:09.883503272 +0000 UTC m=+1489.780461940" Oct 04 03:05:14 crc kubenswrapper[4964]: I1004 03:05:14.919332 4964 generic.go:334] "Generic (PLEG): container finished" podID="a3b6c804-8b89-4692-b6a2-54f606a700a8" containerID="1321d6376d84230cda55e9a6943cc1f92e5653e51833ca64edb440b72491ee27" exitCode=0 Oct 04 03:05:14 crc kubenswrapper[4964]: I1004 03:05:14.919427 4964 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" event={"ID":"a3b6c804-8b89-4692-b6a2-54f606a700a8","Type":"ContainerDied","Data":"1321d6376d84230cda55e9a6943cc1f92e5653e51833ca64edb440b72491ee27"} Oct 04 03:05:16 crc kubenswrapper[4964]: I1004 03:05:16.462412 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" Oct 04 03:05:16 crc kubenswrapper[4964]: I1004 03:05:16.641064 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc4x5\" (UniqueName: \"kubernetes.io/projected/a3b6c804-8b89-4692-b6a2-54f606a700a8-kube-api-access-bc4x5\") pod \"a3b6c804-8b89-4692-b6a2-54f606a700a8\" (UID: \"a3b6c804-8b89-4692-b6a2-54f606a700a8\") " Oct 04 03:05:16 crc kubenswrapper[4964]: I1004 03:05:16.641234 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3b6c804-8b89-4692-b6a2-54f606a700a8-inventory\") pod \"a3b6c804-8b89-4692-b6a2-54f606a700a8\" (UID: \"a3b6c804-8b89-4692-b6a2-54f606a700a8\") " Oct 04 03:05:16 crc kubenswrapper[4964]: I1004 03:05:16.641320 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3b6c804-8b89-4692-b6a2-54f606a700a8-ssh-key\") pod \"a3b6c804-8b89-4692-b6a2-54f606a700a8\" (UID: \"a3b6c804-8b89-4692-b6a2-54f606a700a8\") " Oct 04 03:05:16 crc kubenswrapper[4964]: I1004 03:05:16.647433 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b6c804-8b89-4692-b6a2-54f606a700a8-kube-api-access-bc4x5" (OuterVolumeSpecName: "kube-api-access-bc4x5") pod "a3b6c804-8b89-4692-b6a2-54f606a700a8" (UID: "a3b6c804-8b89-4692-b6a2-54f606a700a8"). InnerVolumeSpecName "kube-api-access-bc4x5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:05:16 crc kubenswrapper[4964]: I1004 03:05:16.677563 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b6c804-8b89-4692-b6a2-54f606a700a8-inventory" (OuterVolumeSpecName: "inventory") pod "a3b6c804-8b89-4692-b6a2-54f606a700a8" (UID: "a3b6c804-8b89-4692-b6a2-54f606a700a8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:05:16 crc kubenswrapper[4964]: I1004 03:05:16.693675 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b6c804-8b89-4692-b6a2-54f606a700a8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a3b6c804-8b89-4692-b6a2-54f606a700a8" (UID: "a3b6c804-8b89-4692-b6a2-54f606a700a8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:05:16 crc kubenswrapper[4964]: I1004 03:05:16.743481 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3b6c804-8b89-4692-b6a2-54f606a700a8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:05:16 crc kubenswrapper[4964]: I1004 03:05:16.743512 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc4x5\" (UniqueName: \"kubernetes.io/projected/a3b6c804-8b89-4692-b6a2-54f606a700a8-kube-api-access-bc4x5\") on node \"crc\" DevicePath \"\"" Oct 04 03:05:16 crc kubenswrapper[4964]: I1004 03:05:16.743526 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3b6c804-8b89-4692-b6a2-54f606a700a8-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:05:16 crc kubenswrapper[4964]: I1004 03:05:16.954246 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" 
event={"ID":"a3b6c804-8b89-4692-b6a2-54f606a700a8","Type":"ContainerDied","Data":"ad4f2464a8470fab88c84cf8b32a6486613aa397e4e0566fcfe606958cb274d8"} Oct 04 03:05:16 crc kubenswrapper[4964]: I1004 03:05:16.954751 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad4f2464a8470fab88c84cf8b32a6486613aa397e4e0566fcfe606958cb274d8" Oct 04 03:05:16 crc kubenswrapper[4964]: I1004 03:05:16.954418 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.045658 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5"] Oct 04 03:05:17 crc kubenswrapper[4964]: E1004 03:05:17.046168 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b6c804-8b89-4692-b6a2-54f606a700a8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.046190 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b6c804-8b89-4692-b6a2-54f606a700a8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.046434 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3b6c804-8b89-4692-b6a2-54f606a700a8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.047209 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.049944 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.050594 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.051087 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.051423 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.061250 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5"] Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.152234 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/590233c5-797b-4b2c-a0e1-c9123b45ba6e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgxk5\" (UID: \"590233c5-797b-4b2c-a0e1-c9123b45ba6e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.152379 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmrml\" (UniqueName: \"kubernetes.io/projected/590233c5-797b-4b2c-a0e1-c9123b45ba6e-kube-api-access-gmrml\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgxk5\" (UID: \"590233c5-797b-4b2c-a0e1-c9123b45ba6e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.152601 4964 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/590233c5-797b-4b2c-a0e1-c9123b45ba6e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgxk5\" (UID: \"590233c5-797b-4b2c-a0e1-c9123b45ba6e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.255044 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmrml\" (UniqueName: \"kubernetes.io/projected/590233c5-797b-4b2c-a0e1-c9123b45ba6e-kube-api-access-gmrml\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgxk5\" (UID: \"590233c5-797b-4b2c-a0e1-c9123b45ba6e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.255200 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/590233c5-797b-4b2c-a0e1-c9123b45ba6e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgxk5\" (UID: \"590233c5-797b-4b2c-a0e1-c9123b45ba6e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.255258 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/590233c5-797b-4b2c-a0e1-c9123b45ba6e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgxk5\" (UID: \"590233c5-797b-4b2c-a0e1-c9123b45ba6e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.263436 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/590233c5-797b-4b2c-a0e1-c9123b45ba6e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgxk5\" (UID: 
\"590233c5-797b-4b2c-a0e1-c9123b45ba6e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.264742 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/590233c5-797b-4b2c-a0e1-c9123b45ba6e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgxk5\" (UID: \"590233c5-797b-4b2c-a0e1-c9123b45ba6e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.278044 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmrml\" (UniqueName: \"kubernetes.io/projected/590233c5-797b-4b2c-a0e1-c9123b45ba6e-kube-api-access-gmrml\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgxk5\" (UID: \"590233c5-797b-4b2c-a0e1-c9123b45ba6e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" Oct 04 03:05:17 crc kubenswrapper[4964]: I1004 03:05:17.382954 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" Oct 04 03:05:18 crc kubenswrapper[4964]: I1004 03:05:18.012136 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5"] Oct 04 03:05:18 crc kubenswrapper[4964]: I1004 03:05:18.976951 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" event={"ID":"590233c5-797b-4b2c-a0e1-c9123b45ba6e","Type":"ContainerStarted","Data":"296a93a9dac5f37a2dd5ac24d90610f80f60ace59216c585b64114206f5818c4"} Oct 04 03:05:19 crc kubenswrapper[4964]: I1004 03:05:19.993138 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" event={"ID":"590233c5-797b-4b2c-a0e1-c9123b45ba6e","Type":"ContainerStarted","Data":"f93716ed7b6c7d61cf7d3834df510d8076b52b3ef9ba5656f18ec9c5a5a6145c"} Oct 04 03:05:20 crc kubenswrapper[4964]: I1004 03:05:20.034198 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" podStartSLOduration=2.195941323 podStartE2EDuration="3.034174614s" podCreationTimestamp="2025-10-04 03:05:17 +0000 UTC" firstStartedPulling="2025-10-04 03:05:18.019641589 +0000 UTC m=+1497.916600267" lastFinishedPulling="2025-10-04 03:05:18.85787488 +0000 UTC m=+1498.754833558" observedRunningTime="2025-10-04 03:05:20.020474378 +0000 UTC m=+1499.917433046" watchObservedRunningTime="2025-10-04 03:05:20.034174614 +0000 UTC m=+1499.931133282" Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.449873 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:05:34 crc 
kubenswrapper[4964]: I1004 03:05:34.452594 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.452937 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.454415 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.454805 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" gracePeriod=600 Oct 04 03:05:34 crc kubenswrapper[4964]: E1004 03:05:34.585034 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.594764 4964 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-kk2qw"] Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.596712 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.619486 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kk2qw"] Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.624466 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-catalog-content\") pod \"redhat-operators-kk2qw\" (UID: \"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a\") " pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.624502 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ptz9\" (UniqueName: \"kubernetes.io/projected/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-kube-api-access-8ptz9\") pod \"redhat-operators-kk2qw\" (UID: \"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a\") " pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.624552 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-utilities\") pod \"redhat-operators-kk2qw\" (UID: \"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a\") " pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.726290 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-catalog-content\") pod \"redhat-operators-kk2qw\" (UID: \"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a\") " 
pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.726344 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ptz9\" (UniqueName: \"kubernetes.io/projected/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-kube-api-access-8ptz9\") pod \"redhat-operators-kk2qw\" (UID: \"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a\") " pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.726404 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-utilities\") pod \"redhat-operators-kk2qw\" (UID: \"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a\") " pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.727146 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-utilities\") pod \"redhat-operators-kk2qw\" (UID: \"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a\") " pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.727159 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-catalog-content\") pod \"redhat-operators-kk2qw\" (UID: \"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a\") " pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:34 crc kubenswrapper[4964]: I1004 03:05:34.749954 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ptz9\" (UniqueName: \"kubernetes.io/projected/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-kube-api-access-8ptz9\") pod \"redhat-operators-kk2qw\" (UID: \"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a\") " pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:34 
crc kubenswrapper[4964]: I1004 03:05:34.959308 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:35 crc kubenswrapper[4964]: I1004 03:05:35.185660 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" exitCode=0 Oct 04 03:05:35 crc kubenswrapper[4964]: I1004 03:05:35.185803 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16"} Oct 04 03:05:35 crc kubenswrapper[4964]: I1004 03:05:35.186056 4964 scope.go:117] "RemoveContainer" containerID="8ac79a9f0f6a28341a809172243b68b1f44d642d4092bfec8fae089b115215b9" Oct 04 03:05:35 crc kubenswrapper[4964]: I1004 03:05:35.186899 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:05:35 crc kubenswrapper[4964]: E1004 03:05:35.187199 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:05:35 crc kubenswrapper[4964]: I1004 03:05:35.428241 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kk2qw"] Oct 04 03:05:35 crc kubenswrapper[4964]: W1004 03:05:35.436307 4964 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod514d7cc6_0bf8_4be0_8d81_b77ab73dc73a.slice/crio-25edd54be24ced85f06d5b7f18e954a6810fcc87dda7b71f0c2dbf289933f3a5 WatchSource:0}: Error finding container 25edd54be24ced85f06d5b7f18e954a6810fcc87dda7b71f0c2dbf289933f3a5: Status 404 returned error can't find the container with id 25edd54be24ced85f06d5b7f18e954a6810fcc87dda7b71f0c2dbf289933f3a5 Oct 04 03:05:36 crc kubenswrapper[4964]: I1004 03:05:36.194583 4964 generic.go:334] "Generic (PLEG): container finished" podID="514d7cc6-0bf8-4be0-8d81-b77ab73dc73a" containerID="177f38a0b1f15483e738aeeef3bbfaeca64c77fc83b6cb1621967793179d9e41" exitCode=0 Oct 04 03:05:36 crc kubenswrapper[4964]: I1004 03:05:36.194673 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kk2qw" event={"ID":"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a","Type":"ContainerDied","Data":"177f38a0b1f15483e738aeeef3bbfaeca64c77fc83b6cb1621967793179d9e41"} Oct 04 03:05:36 crc kubenswrapper[4964]: I1004 03:05:36.194705 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kk2qw" event={"ID":"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a","Type":"ContainerStarted","Data":"25edd54be24ced85f06d5b7f18e954a6810fcc87dda7b71f0c2dbf289933f3a5"} Oct 04 03:05:36 crc kubenswrapper[4964]: I1004 03:05:36.197381 4964 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 03:05:38 crc kubenswrapper[4964]: I1004 03:05:38.228036 4964 generic.go:334] "Generic (PLEG): container finished" podID="514d7cc6-0bf8-4be0-8d81-b77ab73dc73a" containerID="4ec46343eb5bdcb885f2cec1c570936daa5267793c7bcb416005a8f05de31778" exitCode=0 Oct 04 03:05:38 crc kubenswrapper[4964]: I1004 03:05:38.228404 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kk2qw" 
event={"ID":"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a","Type":"ContainerDied","Data":"4ec46343eb5bdcb885f2cec1c570936daa5267793c7bcb416005a8f05de31778"} Oct 04 03:05:39 crc kubenswrapper[4964]: I1004 03:05:39.241766 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kk2qw" event={"ID":"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a","Type":"ContainerStarted","Data":"47505fa46335c86d82d5b9fb5e957671dfbc27bd2d9c20c6e0de496c0f3f4222"} Oct 04 03:05:39 crc kubenswrapper[4964]: I1004 03:05:39.280846 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kk2qw" podStartSLOduration=2.659061022 podStartE2EDuration="5.280825466s" podCreationTimestamp="2025-10-04 03:05:34 +0000 UTC" firstStartedPulling="2025-10-04 03:05:36.197025547 +0000 UTC m=+1516.093984195" lastFinishedPulling="2025-10-04 03:05:38.818789961 +0000 UTC m=+1518.715748639" observedRunningTime="2025-10-04 03:05:39.265766384 +0000 UTC m=+1519.162725082" watchObservedRunningTime="2025-10-04 03:05:39.280825466 +0000 UTC m=+1519.177784124" Oct 04 03:05:44 crc kubenswrapper[4964]: I1004 03:05:44.073196 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-srz4q"] Oct 04 03:05:44 crc kubenswrapper[4964]: I1004 03:05:44.083414 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-srz4q"] Oct 04 03:05:44 crc kubenswrapper[4964]: I1004 03:05:44.093883 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-vgrv6"] Oct 04 03:05:44 crc kubenswrapper[4964]: I1004 03:05:44.105883 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-vgrv6"] Oct 04 03:05:44 crc kubenswrapper[4964]: I1004 03:05:44.864037 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3219f838-82a7-4145-aa9c-e7dd4557d10b" path="/var/lib/kubelet/pods/3219f838-82a7-4145-aa9c-e7dd4557d10b/volumes" Oct 
04 03:05:44 crc kubenswrapper[4964]: I1004 03:05:44.865752 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3268441f-8f21-41a2-a231-4791cd94f615" path="/var/lib/kubelet/pods/3268441f-8f21-41a2-a231-4791cd94f615/volumes" Oct 04 03:05:44 crc kubenswrapper[4964]: I1004 03:05:44.959888 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:44 crc kubenswrapper[4964]: I1004 03:05:44.959955 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:45 crc kubenswrapper[4964]: I1004 03:05:45.043752 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:45 crc kubenswrapper[4964]: I1004 03:05:45.376570 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:45 crc kubenswrapper[4964]: I1004 03:05:45.435798 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kk2qw"] Oct 04 03:05:45 crc kubenswrapper[4964]: I1004 03:05:45.855851 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:05:45 crc kubenswrapper[4964]: E1004 03:05:45.861980 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:05:47 crc kubenswrapper[4964]: I1004 03:05:47.345603 4964 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-kk2qw" podUID="514d7cc6-0bf8-4be0-8d81-b77ab73dc73a" containerName="registry-server" containerID="cri-o://47505fa46335c86d82d5b9fb5e957671dfbc27bd2d9c20c6e0de496c0f3f4222" gracePeriod=2 Oct 04 03:05:47 crc kubenswrapper[4964]: I1004 03:05:47.932715 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.125563 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-catalog-content\") pod \"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a\" (UID: \"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a\") " Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.126044 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ptz9\" (UniqueName: \"kubernetes.io/projected/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-kube-api-access-8ptz9\") pod \"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a\" (UID: \"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a\") " Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.126156 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-utilities\") pod \"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a\" (UID: \"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a\") " Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.127490 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-utilities" (OuterVolumeSpecName: "utilities") pod "514d7cc6-0bf8-4be0-8d81-b77ab73dc73a" (UID: "514d7cc6-0bf8-4be0-8d81-b77ab73dc73a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.133271 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-kube-api-access-8ptz9" (OuterVolumeSpecName: "kube-api-access-8ptz9") pod "514d7cc6-0bf8-4be0-8d81-b77ab73dc73a" (UID: "514d7cc6-0bf8-4be0-8d81-b77ab73dc73a"). InnerVolumeSpecName "kube-api-access-8ptz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.209367 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "514d7cc6-0bf8-4be0-8d81-b77ab73dc73a" (UID: "514d7cc6-0bf8-4be0-8d81-b77ab73dc73a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.228984 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ptz9\" (UniqueName: \"kubernetes.io/projected/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-kube-api-access-8ptz9\") on node \"crc\" DevicePath \"\"" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.229032 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.229051 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.357786 4964 generic.go:334] "Generic (PLEG): container finished" podID="514d7cc6-0bf8-4be0-8d81-b77ab73dc73a" 
containerID="47505fa46335c86d82d5b9fb5e957671dfbc27bd2d9c20c6e0de496c0f3f4222" exitCode=0 Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.357837 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kk2qw" event={"ID":"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a","Type":"ContainerDied","Data":"47505fa46335c86d82d5b9fb5e957671dfbc27bd2d9c20c6e0de496c0f3f4222"} Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.357873 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kk2qw" event={"ID":"514d7cc6-0bf8-4be0-8d81-b77ab73dc73a","Type":"ContainerDied","Data":"25edd54be24ced85f06d5b7f18e954a6810fcc87dda7b71f0c2dbf289933f3a5"} Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.357894 4964 scope.go:117] "RemoveContainer" containerID="47505fa46335c86d82d5b9fb5e957671dfbc27bd2d9c20c6e0de496c0f3f4222" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.357924 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kk2qw" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.377991 4964 scope.go:117] "RemoveContainer" containerID="4ec46343eb5bdcb885f2cec1c570936daa5267793c7bcb416005a8f05de31778" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.403631 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kk2qw"] Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.406356 4964 scope.go:117] "RemoveContainer" containerID="177f38a0b1f15483e738aeeef3bbfaeca64c77fc83b6cb1621967793179d9e41" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.409904 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kk2qw"] Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.456747 4964 scope.go:117] "RemoveContainer" containerID="47505fa46335c86d82d5b9fb5e957671dfbc27bd2d9c20c6e0de496c0f3f4222" Oct 04 03:05:48 crc kubenswrapper[4964]: E1004 03:05:48.457242 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47505fa46335c86d82d5b9fb5e957671dfbc27bd2d9c20c6e0de496c0f3f4222\": container with ID starting with 47505fa46335c86d82d5b9fb5e957671dfbc27bd2d9c20c6e0de496c0f3f4222 not found: ID does not exist" containerID="47505fa46335c86d82d5b9fb5e957671dfbc27bd2d9c20c6e0de496c0f3f4222" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.457282 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47505fa46335c86d82d5b9fb5e957671dfbc27bd2d9c20c6e0de496c0f3f4222"} err="failed to get container status \"47505fa46335c86d82d5b9fb5e957671dfbc27bd2d9c20c6e0de496c0f3f4222\": rpc error: code = NotFound desc = could not find container \"47505fa46335c86d82d5b9fb5e957671dfbc27bd2d9c20c6e0de496c0f3f4222\": container with ID starting with 47505fa46335c86d82d5b9fb5e957671dfbc27bd2d9c20c6e0de496c0f3f4222 not found: ID does 
not exist" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.457306 4964 scope.go:117] "RemoveContainer" containerID="4ec46343eb5bdcb885f2cec1c570936daa5267793c7bcb416005a8f05de31778" Oct 04 03:05:48 crc kubenswrapper[4964]: E1004 03:05:48.457738 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ec46343eb5bdcb885f2cec1c570936daa5267793c7bcb416005a8f05de31778\": container with ID starting with 4ec46343eb5bdcb885f2cec1c570936daa5267793c7bcb416005a8f05de31778 not found: ID does not exist" containerID="4ec46343eb5bdcb885f2cec1c570936daa5267793c7bcb416005a8f05de31778" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.457788 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec46343eb5bdcb885f2cec1c570936daa5267793c7bcb416005a8f05de31778"} err="failed to get container status \"4ec46343eb5bdcb885f2cec1c570936daa5267793c7bcb416005a8f05de31778\": rpc error: code = NotFound desc = could not find container \"4ec46343eb5bdcb885f2cec1c570936daa5267793c7bcb416005a8f05de31778\": container with ID starting with 4ec46343eb5bdcb885f2cec1c570936daa5267793c7bcb416005a8f05de31778 not found: ID does not exist" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.457827 4964 scope.go:117] "RemoveContainer" containerID="177f38a0b1f15483e738aeeef3bbfaeca64c77fc83b6cb1621967793179d9e41" Oct 04 03:05:48 crc kubenswrapper[4964]: E1004 03:05:48.458136 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"177f38a0b1f15483e738aeeef3bbfaeca64c77fc83b6cb1621967793179d9e41\": container with ID starting with 177f38a0b1f15483e738aeeef3bbfaeca64c77fc83b6cb1621967793179d9e41 not found: ID does not exist" containerID="177f38a0b1f15483e738aeeef3bbfaeca64c77fc83b6cb1621967793179d9e41" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.458187 4964 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"177f38a0b1f15483e738aeeef3bbfaeca64c77fc83b6cb1621967793179d9e41"} err="failed to get container status \"177f38a0b1f15483e738aeeef3bbfaeca64c77fc83b6cb1621967793179d9e41\": rpc error: code = NotFound desc = could not find container \"177f38a0b1f15483e738aeeef3bbfaeca64c77fc83b6cb1621967793179d9e41\": container with ID starting with 177f38a0b1f15483e738aeeef3bbfaeca64c77fc83b6cb1621967793179d9e41 not found: ID does not exist" Oct 04 03:05:48 crc kubenswrapper[4964]: I1004 03:05:48.855735 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="514d7cc6-0bf8-4be0-8d81-b77ab73dc73a" path="/var/lib/kubelet/pods/514d7cc6-0bf8-4be0-8d81-b77ab73dc73a/volumes" Oct 04 03:05:50 crc kubenswrapper[4964]: I1004 03:05:50.030592 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-5bxql"] Oct 04 03:05:50 crc kubenswrapper[4964]: I1004 03:05:50.040398 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-5bxql"] Oct 04 03:05:50 crc kubenswrapper[4964]: I1004 03:05:50.858774 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2435773b-23eb-4a67-b454-4531b0c41831" path="/var/lib/kubelet/pods/2435773b-23eb-4a67-b454-4531b0c41831/volumes" Oct 04 03:05:54 crc kubenswrapper[4964]: I1004 03:05:54.032869 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ee08-account-create-frb92"] Oct 04 03:05:54 crc kubenswrapper[4964]: I1004 03:05:54.042106 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ee08-account-create-frb92"] Oct 04 03:05:54 crc kubenswrapper[4964]: I1004 03:05:54.865869 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5035eec8-0929-4248-a007-d4dda1330e10" path="/var/lib/kubelet/pods/5035eec8-0929-4248-a007-d4dda1330e10/volumes" Oct 04 03:05:55 crc kubenswrapper[4964]: I1004 03:05:55.057987 4964 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/placement-b6de-account-create-qzzbp"] Oct 04 03:05:55 crc kubenswrapper[4964]: I1004 03:05:55.082025 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b6de-account-create-qzzbp"] Oct 04 03:05:56 crc kubenswrapper[4964]: I1004 03:05:56.861055 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eae45e80-cbad-42ac-a8a3-1141611d5f2d" path="/var/lib/kubelet/pods/eae45e80-cbad-42ac-a8a3-1141611d5f2d/volumes" Oct 04 03:05:58 crc kubenswrapper[4964]: I1004 03:05:58.845498 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:05:58 crc kubenswrapper[4964]: E1004 03:05:58.846455 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:05:59 crc kubenswrapper[4964]: I1004 03:05:59.476840 4964 generic.go:334] "Generic (PLEG): container finished" podID="590233c5-797b-4b2c-a0e1-c9123b45ba6e" containerID="f93716ed7b6c7d61cf7d3834df510d8076b52b3ef9ba5656f18ec9c5a5a6145c" exitCode=0 Oct 04 03:05:59 crc kubenswrapper[4964]: I1004 03:05:59.476879 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" event={"ID":"590233c5-797b-4b2c-a0e1-c9123b45ba6e","Type":"ContainerDied","Data":"f93716ed7b6c7d61cf7d3834df510d8076b52b3ef9ba5656f18ec9c5a5a6145c"} Oct 04 03:06:00 crc kubenswrapper[4964]: I1004 03:06:00.049532 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1840-account-create-hnvtp"] Oct 04 03:06:00 crc kubenswrapper[4964]: I1004 03:06:00.062183 4964 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1840-account-create-hnvtp"] Oct 04 03:06:00 crc kubenswrapper[4964]: I1004 03:06:00.858313 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f231c3-73a1-4152-b618-ba3ac0e2b7f3" path="/var/lib/kubelet/pods/46f231c3-73a1-4152-b618-ba3ac0e2b7f3/volumes" Oct 04 03:06:00 crc kubenswrapper[4964]: I1004 03:06:00.952253 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" Oct 04 03:06:00 crc kubenswrapper[4964]: I1004 03:06:00.980087 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/590233c5-797b-4b2c-a0e1-c9123b45ba6e-inventory\") pod \"590233c5-797b-4b2c-a0e1-c9123b45ba6e\" (UID: \"590233c5-797b-4b2c-a0e1-c9123b45ba6e\") " Oct 04 03:06:00 crc kubenswrapper[4964]: I1004 03:06:00.980164 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmrml\" (UniqueName: \"kubernetes.io/projected/590233c5-797b-4b2c-a0e1-c9123b45ba6e-kube-api-access-gmrml\") pod \"590233c5-797b-4b2c-a0e1-c9123b45ba6e\" (UID: \"590233c5-797b-4b2c-a0e1-c9123b45ba6e\") " Oct 04 03:06:00 crc kubenswrapper[4964]: I1004 03:06:00.980501 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/590233c5-797b-4b2c-a0e1-c9123b45ba6e-ssh-key\") pod \"590233c5-797b-4b2c-a0e1-c9123b45ba6e\" (UID: \"590233c5-797b-4b2c-a0e1-c9123b45ba6e\") " Oct 04 03:06:00 crc kubenswrapper[4964]: I1004 03:06:00.989835 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590233c5-797b-4b2c-a0e1-c9123b45ba6e-kube-api-access-gmrml" (OuterVolumeSpecName: "kube-api-access-gmrml") pod "590233c5-797b-4b2c-a0e1-c9123b45ba6e" (UID: "590233c5-797b-4b2c-a0e1-c9123b45ba6e"). 
InnerVolumeSpecName "kube-api-access-gmrml". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.013193 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590233c5-797b-4b2c-a0e1-c9123b45ba6e-inventory" (OuterVolumeSpecName: "inventory") pod "590233c5-797b-4b2c-a0e1-c9123b45ba6e" (UID: "590233c5-797b-4b2c-a0e1-c9123b45ba6e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.016732 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590233c5-797b-4b2c-a0e1-c9123b45ba6e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "590233c5-797b-4b2c-a0e1-c9123b45ba6e" (UID: "590233c5-797b-4b2c-a0e1-c9123b45ba6e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.081805 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/590233c5-797b-4b2c-a0e1-c9123b45ba6e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.081843 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/590233c5-797b-4b2c-a0e1-c9123b45ba6e-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.081856 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmrml\" (UniqueName: \"kubernetes.io/projected/590233c5-797b-4b2c-a0e1-c9123b45ba6e-kube-api-access-gmrml\") on node \"crc\" DevicePath \"\"" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.501844 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" 
event={"ID":"590233c5-797b-4b2c-a0e1-c9123b45ba6e","Type":"ContainerDied","Data":"296a93a9dac5f37a2dd5ac24d90610f80f60ace59216c585b64114206f5818c4"} Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.501899 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="296a93a9dac5f37a2dd5ac24d90610f80f60ace59216c585b64114206f5818c4" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.501947 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.632394 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq"] Oct 04 03:06:01 crc kubenswrapper[4964]: E1004 03:06:01.632867 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514d7cc6-0bf8-4be0-8d81-b77ab73dc73a" containerName="extract-content" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.632887 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="514d7cc6-0bf8-4be0-8d81-b77ab73dc73a" containerName="extract-content" Oct 04 03:06:01 crc kubenswrapper[4964]: E1004 03:06:01.632911 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514d7cc6-0bf8-4be0-8d81-b77ab73dc73a" containerName="registry-server" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.632921 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="514d7cc6-0bf8-4be0-8d81-b77ab73dc73a" containerName="registry-server" Oct 04 03:06:01 crc kubenswrapper[4964]: E1004 03:06:01.632936 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590233c5-797b-4b2c-a0e1-c9123b45ba6e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.632947 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="590233c5-797b-4b2c-a0e1-c9123b45ba6e" 
containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:06:01 crc kubenswrapper[4964]: E1004 03:06:01.632987 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514d7cc6-0bf8-4be0-8d81-b77ab73dc73a" containerName="extract-utilities" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.632996 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="514d7cc6-0bf8-4be0-8d81-b77ab73dc73a" containerName="extract-utilities" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.633188 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="590233c5-797b-4b2c-a0e1-c9123b45ba6e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.633209 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="514d7cc6-0bf8-4be0-8d81-b77ab73dc73a" containerName="registry-server" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.633910 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.642420 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.642894 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.643129 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.643143 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.650183 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq"] Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.798738 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50426398-6c3b-4482-a791-f5f98ec0f076-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq\" (UID: \"50426398-6c3b-4482-a791-f5f98ec0f076\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.798903 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50426398-6c3b-4482-a791-f5f98ec0f076-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq\" (UID: \"50426398-6c3b-4482-a791-f5f98ec0f076\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.798999 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjpkx\" (UniqueName: \"kubernetes.io/projected/50426398-6c3b-4482-a791-f5f98ec0f076-kube-api-access-bjpkx\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq\" (UID: \"50426398-6c3b-4482-a791-f5f98ec0f076\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.900313 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjpkx\" (UniqueName: \"kubernetes.io/projected/50426398-6c3b-4482-a791-f5f98ec0f076-kube-api-access-bjpkx\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq\" (UID: \"50426398-6c3b-4482-a791-f5f98ec0f076\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.900631 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50426398-6c3b-4482-a791-f5f98ec0f076-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq\" (UID: \"50426398-6c3b-4482-a791-f5f98ec0f076\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.900705 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50426398-6c3b-4482-a791-f5f98ec0f076-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq\" (UID: \"50426398-6c3b-4482-a791-f5f98ec0f076\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.904086 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50426398-6c3b-4482-a791-f5f98ec0f076-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq\" (UID: 
\"50426398-6c3b-4482-a791-f5f98ec0f076\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.910360 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50426398-6c3b-4482-a791-f5f98ec0f076-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq\" (UID: \"50426398-6c3b-4482-a791-f5f98ec0f076\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.923458 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjpkx\" (UniqueName: \"kubernetes.io/projected/50426398-6c3b-4482-a791-f5f98ec0f076-kube-api-access-bjpkx\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq\" (UID: \"50426398-6c3b-4482-a791-f5f98ec0f076\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" Oct 04 03:06:01 crc kubenswrapper[4964]: I1004 03:06:01.966180 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" Oct 04 03:06:02 crc kubenswrapper[4964]: I1004 03:06:02.277766 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq"] Oct 04 03:06:02 crc kubenswrapper[4964]: I1004 03:06:02.516584 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" event={"ID":"50426398-6c3b-4482-a791-f5f98ec0f076","Type":"ContainerStarted","Data":"dddb97ad1360104ff7ace259f344416ad5d1dcaace3bec7a0a63ea24fb389abf"} Oct 04 03:06:03 crc kubenswrapper[4964]: I1004 03:06:03.528694 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" event={"ID":"50426398-6c3b-4482-a791-f5f98ec0f076","Type":"ContainerStarted","Data":"bf4f6baf90ed202d3070a26fd63fd75c14dcee462ccb781ea45612d8796b29ff"} Oct 04 03:06:03 crc kubenswrapper[4964]: I1004 03:06:03.564400 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" podStartSLOduration=2.164829543 podStartE2EDuration="2.564372864s" podCreationTimestamp="2025-10-04 03:06:01 +0000 UTC" firstStartedPulling="2025-10-04 03:06:02.285899365 +0000 UTC m=+1542.182858003" lastFinishedPulling="2025-10-04 03:06:02.685442656 +0000 UTC m=+1542.582401324" observedRunningTime="2025-10-04 03:06:03.550839352 +0000 UTC m=+1543.447798020" watchObservedRunningTime="2025-10-04 03:06:03.564372864 +0000 UTC m=+1543.461331532" Oct 04 03:06:07 crc kubenswrapper[4964]: I1004 03:06:07.581016 4964 generic.go:334] "Generic (PLEG): container finished" podID="50426398-6c3b-4482-a791-f5f98ec0f076" containerID="bf4f6baf90ed202d3070a26fd63fd75c14dcee462ccb781ea45612d8796b29ff" exitCode=0 Oct 04 03:06:07 crc kubenswrapper[4964]: I1004 03:06:07.581055 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" event={"ID":"50426398-6c3b-4482-a791-f5f98ec0f076","Type":"ContainerDied","Data":"bf4f6baf90ed202d3070a26fd63fd75c14dcee462ccb781ea45612d8796b29ff"} Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.089943 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.187103 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50426398-6c3b-4482-a791-f5f98ec0f076-ssh-key\") pod \"50426398-6c3b-4482-a791-f5f98ec0f076\" (UID: \"50426398-6c3b-4482-a791-f5f98ec0f076\") " Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.187296 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjpkx\" (UniqueName: \"kubernetes.io/projected/50426398-6c3b-4482-a791-f5f98ec0f076-kube-api-access-bjpkx\") pod \"50426398-6c3b-4482-a791-f5f98ec0f076\" (UID: \"50426398-6c3b-4482-a791-f5f98ec0f076\") " Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.187349 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50426398-6c3b-4482-a791-f5f98ec0f076-inventory\") pod \"50426398-6c3b-4482-a791-f5f98ec0f076\" (UID: \"50426398-6c3b-4482-a791-f5f98ec0f076\") " Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.202508 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50426398-6c3b-4482-a791-f5f98ec0f076-kube-api-access-bjpkx" (OuterVolumeSpecName: "kube-api-access-bjpkx") pod "50426398-6c3b-4482-a791-f5f98ec0f076" (UID: "50426398-6c3b-4482-a791-f5f98ec0f076"). InnerVolumeSpecName "kube-api-access-bjpkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.242984 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50426398-6c3b-4482-a791-f5f98ec0f076-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "50426398-6c3b-4482-a791-f5f98ec0f076" (UID: "50426398-6c3b-4482-a791-f5f98ec0f076"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.244163 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50426398-6c3b-4482-a791-f5f98ec0f076-inventory" (OuterVolumeSpecName: "inventory") pod "50426398-6c3b-4482-a791-f5f98ec0f076" (UID: "50426398-6c3b-4482-a791-f5f98ec0f076"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.289690 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjpkx\" (UniqueName: \"kubernetes.io/projected/50426398-6c3b-4482-a791-f5f98ec0f076-kube-api-access-bjpkx\") on node \"crc\" DevicePath \"\"" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.289748 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50426398-6c3b-4482-a791-f5f98ec0f076-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.289767 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50426398-6c3b-4482-a791-f5f98ec0f076-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.613121 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" 
event={"ID":"50426398-6c3b-4482-a791-f5f98ec0f076","Type":"ContainerDied","Data":"dddb97ad1360104ff7ace259f344416ad5d1dcaace3bec7a0a63ea24fb389abf"} Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.613451 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dddb97ad1360104ff7ace259f344416ad5d1dcaace3bec7a0a63ea24fb389abf" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.613529 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.710639 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw"] Oct 04 03:06:09 crc kubenswrapper[4964]: E1004 03:06:09.711306 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50426398-6c3b-4482-a791-f5f98ec0f076" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.711327 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="50426398-6c3b-4482-a791-f5f98ec0f076" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.711569 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="50426398-6c3b-4482-a791-f5f98ec0f076" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.712299 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.718880 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.719263 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.719552 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.722663 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.728858 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw"] Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.800024 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c408fde-7339-474f-a9c4-c028570fbd40-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw\" (UID: \"7c408fde-7339-474f-a9c4-c028570fbd40\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.800129 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7c408fde-7339-474f-a9c4-c028570fbd40-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw\" (UID: \"7c408fde-7339-474f-a9c4-c028570fbd40\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.800209 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbvx8\" (UniqueName: \"kubernetes.io/projected/7c408fde-7339-474f-a9c4-c028570fbd40-kube-api-access-hbvx8\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw\" (UID: \"7c408fde-7339-474f-a9c4-c028570fbd40\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.902037 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbvx8\" (UniqueName: \"kubernetes.io/projected/7c408fde-7339-474f-a9c4-c028570fbd40-kube-api-access-hbvx8\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw\" (UID: \"7c408fde-7339-474f-a9c4-c028570fbd40\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.903498 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c408fde-7339-474f-a9c4-c028570fbd40-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw\" (UID: \"7c408fde-7339-474f-a9c4-c028570fbd40\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.903668 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7c408fde-7339-474f-a9c4-c028570fbd40-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw\" (UID: \"7c408fde-7339-474f-a9c4-c028570fbd40\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.907545 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c408fde-7339-474f-a9c4-c028570fbd40-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw\" (UID: 
\"7c408fde-7339-474f-a9c4-c028570fbd40\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.908715 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7c408fde-7339-474f-a9c4-c028570fbd40-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw\" (UID: \"7c408fde-7339-474f-a9c4-c028570fbd40\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" Oct 04 03:06:09 crc kubenswrapper[4964]: I1004 03:06:09.928267 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbvx8\" (UniqueName: \"kubernetes.io/projected/7c408fde-7339-474f-a9c4-c028570fbd40-kube-api-access-hbvx8\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw\" (UID: \"7c408fde-7339-474f-a9c4-c028570fbd40\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" Oct 04 03:06:10 crc kubenswrapper[4964]: I1004 03:06:10.047127 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" Oct 04 03:06:10 crc kubenswrapper[4964]: I1004 03:06:10.632499 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw"] Oct 04 03:06:11 crc kubenswrapper[4964]: I1004 03:06:11.635553 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" event={"ID":"7c408fde-7339-474f-a9c4-c028570fbd40","Type":"ContainerStarted","Data":"90b7151b7982209568cd6b2c39af571f19cd429a8334b88edd32faed69f09b6a"} Oct 04 03:06:11 crc kubenswrapper[4964]: I1004 03:06:11.635850 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" event={"ID":"7c408fde-7339-474f-a9c4-c028570fbd40","Type":"ContainerStarted","Data":"e5edbe4e135d394eb69780deba2d0a6892d686f9af8fce6f7e5f73646f97d20f"} Oct 04 03:06:11 crc kubenswrapper[4964]: I1004 03:06:11.661026 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" podStartSLOduration=2.115993544 podStartE2EDuration="2.661010796s" podCreationTimestamp="2025-10-04 03:06:09 +0000 UTC" firstStartedPulling="2025-10-04 03:06:10.640720865 +0000 UTC m=+1550.537679493" lastFinishedPulling="2025-10-04 03:06:11.185738107 +0000 UTC m=+1551.082696745" observedRunningTime="2025-10-04 03:06:11.657666959 +0000 UTC m=+1551.554625637" watchObservedRunningTime="2025-10-04 03:06:11.661010796 +0000 UTC m=+1551.557969444" Oct 04 03:06:11 crc kubenswrapper[4964]: I1004 03:06:11.844763 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:06:11 crc kubenswrapper[4964]: E1004 03:06:11.845090 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:06:22 crc kubenswrapper[4964]: I1004 03:06:22.077285 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-prrvx"] Oct 04 03:06:22 crc kubenswrapper[4964]: I1004 03:06:22.094317 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-ck4kh"] Oct 04 03:06:22 crc kubenswrapper[4964]: I1004 03:06:22.101088 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-c4xjv"] Oct 04 03:06:22 crc kubenswrapper[4964]: I1004 03:06:22.107427 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-prrvx"] Oct 04 03:06:22 crc kubenswrapper[4964]: I1004 03:06:22.114778 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-ck4kh"] Oct 04 03:06:22 crc kubenswrapper[4964]: I1004 03:06:22.123704 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-c4xjv"] Oct 04 03:06:22 crc kubenswrapper[4964]: I1004 03:06:22.863421 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="052d791b-de97-4d7c-b150-81e9fec1e0fc" path="/var/lib/kubelet/pods/052d791b-de97-4d7c-b150-81e9fec1e0fc/volumes" Oct 04 03:06:22 crc kubenswrapper[4964]: I1004 03:06:22.864517 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cc3e181-d921-4612-94f9-525ee8a91275" path="/var/lib/kubelet/pods/1cc3e181-d921-4612-94f9-525ee8a91275/volumes" Oct 04 03:06:22 crc kubenswrapper[4964]: I1004 03:06:22.865530 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58c86e7f-453a-4cc5-a487-cd5ada7f25d2" 
path="/var/lib/kubelet/pods/58c86e7f-453a-4cc5-a487-cd5ada7f25d2/volumes" Oct 04 03:06:23 crc kubenswrapper[4964]: I1004 03:06:23.845859 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:06:23 crc kubenswrapper[4964]: E1004 03:06:23.846646 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:06:24 crc kubenswrapper[4964]: I1004 03:06:24.033664 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-vjrgk"] Oct 04 03:06:24 crc kubenswrapper[4964]: I1004 03:06:24.049913 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-vjrgk"] Oct 04 03:06:24 crc kubenswrapper[4964]: I1004 03:06:24.861046 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1021ace-0fb9-45a8-b83b-12a487b37bf3" path="/var/lib/kubelet/pods/f1021ace-0fb9-45a8-b83b-12a487b37bf3/volumes" Oct 04 03:06:26 crc kubenswrapper[4964]: I1004 03:06:26.045154 4964 scope.go:117] "RemoveContainer" containerID="24b93231cc2d64dccaf39d13f08f312ae902afdec4c74a08d7d2d545c575c86c" Oct 04 03:06:26 crc kubenswrapper[4964]: I1004 03:06:26.075688 4964 scope.go:117] "RemoveContainer" containerID="ed6e40cfb81a9abf65b179922e645cd4b43fd20d669c83bc2a3a2ba42bc33617" Oct 04 03:06:26 crc kubenswrapper[4964]: I1004 03:06:26.110812 4964 scope.go:117] "RemoveContainer" containerID="776d04b224e16a7471121aba66de0fe29e36e9d403433e30608db87af4ea8708" Oct 04 03:06:26 crc kubenswrapper[4964]: I1004 03:06:26.158326 4964 scope.go:117] "RemoveContainer" 
containerID="ca8c25952e5ed21fcf9a509892af37cbd5efca74f8a26a776ef9957e78998dbe" Oct 04 03:06:26 crc kubenswrapper[4964]: I1004 03:06:26.197015 4964 scope.go:117] "RemoveContainer" containerID="fbda6431e83369b5ab7b7104d3d38ef14505db0edfe5f0c58e080a35e0da541f" Oct 04 03:06:26 crc kubenswrapper[4964]: I1004 03:06:26.233119 4964 scope.go:117] "RemoveContainer" containerID="073961a4654faf47c49dbee30cc94672c0be75e3c5999128f54a47b58cf20d77" Oct 04 03:06:26 crc kubenswrapper[4964]: I1004 03:06:26.305367 4964 scope.go:117] "RemoveContainer" containerID="c76481151603b19eeb7778b8cf149fbbaa539806023499d9af1dfdb1118986d7" Oct 04 03:06:26 crc kubenswrapper[4964]: I1004 03:06:26.328078 4964 scope.go:117] "RemoveContainer" containerID="c50ac3544af7a9cfccb3d272e2dcae48a2ca1d92d4cff67360e26ca06ef705c1" Oct 04 03:06:26 crc kubenswrapper[4964]: I1004 03:06:26.347681 4964 scope.go:117] "RemoveContainer" containerID="6c7cee255db4c04d0824d62a05ab8b73d451c8340899d7864ebaf68d71e70aab" Oct 04 03:06:26 crc kubenswrapper[4964]: I1004 03:06:26.373648 4964 scope.go:117] "RemoveContainer" containerID="73183fd374e58f891850fec52755929e6de0fb9eeb1c5f46d9b82630c248f241" Oct 04 03:06:31 crc kubenswrapper[4964]: I1004 03:06:31.053695 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-45kld"] Oct 04 03:06:31 crc kubenswrapper[4964]: I1004 03:06:31.068851 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-45kld"] Oct 04 03:06:32 crc kubenswrapper[4964]: I1004 03:06:32.866490 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34e16345-cf64-413c-a394-35c20d93aa02" path="/var/lib/kubelet/pods/34e16345-cf64-413c-a394-35c20d93aa02/volumes" Oct 04 03:06:34 crc kubenswrapper[4964]: I1004 03:06:34.036196 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4f8a-account-create-flfbn"] Oct 04 03:06:34 crc kubenswrapper[4964]: I1004 03:06:34.049310 4964 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/neutron-0b5c-account-create-nwmgs"] Oct 04 03:06:34 crc kubenswrapper[4964]: I1004 03:06:34.060060 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6c4d-account-create-rv5nz"] Oct 04 03:06:34 crc kubenswrapper[4964]: I1004 03:06:34.066714 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4f8a-account-create-flfbn"] Oct 04 03:06:34 crc kubenswrapper[4964]: I1004 03:06:34.072650 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6c4d-account-create-rv5nz"] Oct 04 03:06:34 crc kubenswrapper[4964]: I1004 03:06:34.078274 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0b5c-account-create-nwmgs"] Oct 04 03:06:34 crc kubenswrapper[4964]: I1004 03:06:34.844967 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:06:34 crc kubenswrapper[4964]: E1004 03:06:34.845678 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:06:34 crc kubenswrapper[4964]: I1004 03:06:34.873953 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26dddf74-bd10-4534-9c12-8fd8c9311475" path="/var/lib/kubelet/pods/26dddf74-bd10-4534-9c12-8fd8c9311475/volumes" Oct 04 03:06:34 crc kubenswrapper[4964]: I1004 03:06:34.874934 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81b5818e-cfbf-49bc-995f-b7024d46b020" path="/var/lib/kubelet/pods/81b5818e-cfbf-49bc-995f-b7024d46b020/volumes" Oct 04 03:06:34 crc kubenswrapper[4964]: I1004 03:06:34.875535 4964 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2fdf522-a1fc-4ce6-a819-533ac61f5a5d" path="/var/lib/kubelet/pods/e2fdf522-a1fc-4ce6-a819-533ac61f5a5d/volumes" Oct 04 03:06:45 crc kubenswrapper[4964]: I1004 03:06:45.055030 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-nl7sp"] Oct 04 03:06:45 crc kubenswrapper[4964]: I1004 03:06:45.063242 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-nl7sp"] Oct 04 03:06:45 crc kubenswrapper[4964]: I1004 03:06:45.848662 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:06:45 crc kubenswrapper[4964]: E1004 03:06:45.849091 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:06:46 crc kubenswrapper[4964]: I1004 03:06:46.865833 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da802d94-8d77-4b2c-88a0-3edc6e7c115b" path="/var/lib/kubelet/pods/da802d94-8d77-4b2c-88a0-3edc6e7c115b/volumes" Oct 04 03:06:47 crc kubenswrapper[4964]: I1004 03:06:47.037975 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-z4k2k"] Oct 04 03:06:47 crc kubenswrapper[4964]: I1004 03:06:47.046533 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-z4k2k"] Oct 04 03:06:48 crc kubenswrapper[4964]: I1004 03:06:48.863702 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dda8845-8294-4367-a5e7-055b6e6711a3" path="/var/lib/kubelet/pods/4dda8845-8294-4367-a5e7-055b6e6711a3/volumes" Oct 04 
03:06:57 crc kubenswrapper[4964]: I1004 03:06:57.845436 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:06:57 crc kubenswrapper[4964]: E1004 03:06:57.846430 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:07:01 crc kubenswrapper[4964]: I1004 03:07:01.045708 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-bwnzx"] Oct 04 03:07:01 crc kubenswrapper[4964]: I1004 03:07:01.055278 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-bwnzx"] Oct 04 03:07:02 crc kubenswrapper[4964]: I1004 03:07:02.864420 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e6a68d-94f3-4485-9bbd-eccc7a9398d2" path="/var/lib/kubelet/pods/79e6a68d-94f3-4485-9bbd-eccc7a9398d2/volumes" Oct 04 03:07:09 crc kubenswrapper[4964]: I1004 03:07:09.274065 4964 generic.go:334] "Generic (PLEG): container finished" podID="7c408fde-7339-474f-a9c4-c028570fbd40" containerID="90b7151b7982209568cd6b2c39af571f19cd429a8334b88edd32faed69f09b6a" exitCode=2 Oct 04 03:07:09 crc kubenswrapper[4964]: I1004 03:07:09.274154 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" event={"ID":"7c408fde-7339-474f-a9c4-c028570fbd40","Type":"ContainerDied","Data":"90b7151b7982209568cd6b2c39af571f19cd429a8334b88edd32faed69f09b6a"} Oct 04 03:07:10 crc kubenswrapper[4964]: I1004 03:07:10.853728 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" Oct 04 03:07:10 crc kubenswrapper[4964]: I1004 03:07:10.854227 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:07:10 crc kubenswrapper[4964]: E1004 03:07:10.854534 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:07:10 crc kubenswrapper[4964]: I1004 03:07:10.978375 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbvx8\" (UniqueName: \"kubernetes.io/projected/7c408fde-7339-474f-a9c4-c028570fbd40-kube-api-access-hbvx8\") pod \"7c408fde-7339-474f-a9c4-c028570fbd40\" (UID: \"7c408fde-7339-474f-a9c4-c028570fbd40\") " Oct 04 03:07:10 crc kubenswrapper[4964]: I1004 03:07:10.978572 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c408fde-7339-474f-a9c4-c028570fbd40-inventory\") pod \"7c408fde-7339-474f-a9c4-c028570fbd40\" (UID: \"7c408fde-7339-474f-a9c4-c028570fbd40\") " Oct 04 03:07:10 crc kubenswrapper[4964]: I1004 03:07:10.978635 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7c408fde-7339-474f-a9c4-c028570fbd40-ssh-key\") pod \"7c408fde-7339-474f-a9c4-c028570fbd40\" (UID: \"7c408fde-7339-474f-a9c4-c028570fbd40\") " Oct 04 03:07:10 crc kubenswrapper[4964]: I1004 03:07:10.987197 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7c408fde-7339-474f-a9c4-c028570fbd40-kube-api-access-hbvx8" (OuterVolumeSpecName: "kube-api-access-hbvx8") pod "7c408fde-7339-474f-a9c4-c028570fbd40" (UID: "7c408fde-7339-474f-a9c4-c028570fbd40"). InnerVolumeSpecName "kube-api-access-hbvx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:07:11 crc kubenswrapper[4964]: I1004 03:07:11.016331 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c408fde-7339-474f-a9c4-c028570fbd40-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7c408fde-7339-474f-a9c4-c028570fbd40" (UID: "7c408fde-7339-474f-a9c4-c028570fbd40"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:07:11 crc kubenswrapper[4964]: I1004 03:07:11.054093 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c408fde-7339-474f-a9c4-c028570fbd40-inventory" (OuterVolumeSpecName: "inventory") pod "7c408fde-7339-474f-a9c4-c028570fbd40" (UID: "7c408fde-7339-474f-a9c4-c028570fbd40"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:07:11 crc kubenswrapper[4964]: I1004 03:07:11.066752 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-8sd9c"] Oct 04 03:07:11 crc kubenswrapper[4964]: I1004 03:07:11.080049 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-8sd9c"] Oct 04 03:07:11 crc kubenswrapper[4964]: I1004 03:07:11.081066 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c408fde-7339-474f-a9c4-c028570fbd40-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:07:11 crc kubenswrapper[4964]: I1004 03:07:11.081110 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7c408fde-7339-474f-a9c4-c028570fbd40-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:07:11 crc kubenswrapper[4964]: I1004 03:07:11.081132 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbvx8\" (UniqueName: \"kubernetes.io/projected/7c408fde-7339-474f-a9c4-c028570fbd40-kube-api-access-hbvx8\") on node \"crc\" DevicePath \"\"" Oct 04 03:07:11 crc kubenswrapper[4964]: I1004 03:07:11.299955 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" event={"ID":"7c408fde-7339-474f-a9c4-c028570fbd40","Type":"ContainerDied","Data":"e5edbe4e135d394eb69780deba2d0a6892d686f9af8fce6f7e5f73646f97d20f"} Oct 04 03:07:11 crc kubenswrapper[4964]: I1004 03:07:11.300014 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5edbe4e135d394eb69780deba2d0a6892d686f9af8fce6f7e5f73646f97d20f" Oct 04 03:07:11 crc kubenswrapper[4964]: I1004 03:07:11.300039 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw" Oct 04 03:07:12 crc kubenswrapper[4964]: I1004 03:07:12.857731 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="481d6463-07ac-427e-b5f6-f85143ebf2e0" path="/var/lib/kubelet/pods/481d6463-07ac-427e-b5f6-f85143ebf2e0/volumes" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.064052 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9"] Oct 04 03:07:18 crc kubenswrapper[4964]: E1004 03:07:18.065181 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c408fde-7339-474f-a9c4-c028570fbd40" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.065203 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c408fde-7339-474f-a9c4-c028570fbd40" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.065462 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c408fde-7339-474f-a9c4-c028570fbd40" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.066192 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.070295 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.073327 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9"] Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.073851 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.073939 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.074163 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.146524 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcqvz\" (UniqueName: \"kubernetes.io/projected/84dbc334-c57b-42f9-98c0-13ec5973b663-kube-api-access-xcqvz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9\" (UID: \"84dbc334-c57b-42f9-98c0-13ec5973b663\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.146993 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84dbc334-c57b-42f9-98c0-13ec5973b663-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9\" (UID: \"84dbc334-c57b-42f9-98c0-13ec5973b663\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.147128 4964 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/84dbc334-c57b-42f9-98c0-13ec5973b663-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9\" (UID: \"84dbc334-c57b-42f9-98c0-13ec5973b663\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.248968 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcqvz\" (UniqueName: \"kubernetes.io/projected/84dbc334-c57b-42f9-98c0-13ec5973b663-kube-api-access-xcqvz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9\" (UID: \"84dbc334-c57b-42f9-98c0-13ec5973b663\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.249252 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84dbc334-c57b-42f9-98c0-13ec5973b663-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9\" (UID: \"84dbc334-c57b-42f9-98c0-13ec5973b663\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.249308 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/84dbc334-c57b-42f9-98c0-13ec5973b663-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9\" (UID: \"84dbc334-c57b-42f9-98c0-13ec5973b663\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.259595 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/84dbc334-c57b-42f9-98c0-13ec5973b663-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9\" (UID: 
\"84dbc334-c57b-42f9-98c0-13ec5973b663\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.265839 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84dbc334-c57b-42f9-98c0-13ec5973b663-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9\" (UID: \"84dbc334-c57b-42f9-98c0-13ec5973b663\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.269585 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcqvz\" (UniqueName: \"kubernetes.io/projected/84dbc334-c57b-42f9-98c0-13ec5973b663-kube-api-access-xcqvz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9\" (UID: \"84dbc334-c57b-42f9-98c0-13ec5973b663\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.397507 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" Oct 04 03:07:18 crc kubenswrapper[4964]: I1004 03:07:18.768195 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9"] Oct 04 03:07:19 crc kubenswrapper[4964]: I1004 03:07:19.390236 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" event={"ID":"84dbc334-c57b-42f9-98c0-13ec5973b663","Type":"ContainerStarted","Data":"f2f64335836a569dfc468ab4bc808256c7824433854ed93cf8609355dd1b5ab1"} Oct 04 03:07:20 crc kubenswrapper[4964]: I1004 03:07:20.403266 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" event={"ID":"84dbc334-c57b-42f9-98c0-13ec5973b663","Type":"ContainerStarted","Data":"deeda032b90e865fd09c445f6fa677b983d3db6072b30a22eb282f658b1795de"} Oct 04 03:07:20 crc kubenswrapper[4964]: I1004 03:07:20.434791 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" podStartSLOduration=1.962426706 podStartE2EDuration="2.434770596s" podCreationTimestamp="2025-10-04 03:07:18 +0000 UTC" firstStartedPulling="2025-10-04 03:07:18.770065393 +0000 UTC m=+1618.667024071" lastFinishedPulling="2025-10-04 03:07:19.242409313 +0000 UTC m=+1619.139367961" observedRunningTime="2025-10-04 03:07:20.421807046 +0000 UTC m=+1620.318765684" watchObservedRunningTime="2025-10-04 03:07:20.434770596 +0000 UTC m=+1620.331729234" Oct 04 03:07:23 crc kubenswrapper[4964]: I1004 03:07:23.047568 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-c8kkp"] Oct 04 03:07:23 crc kubenswrapper[4964]: I1004 03:07:23.063015 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-c8kkp"] Oct 04 03:07:24 crc kubenswrapper[4964]: I1004 03:07:24.856919 4964 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="969137a9-7e00-4472-8582-8008c5647750" path="/var/lib/kubelet/pods/969137a9-7e00-4472-8582-8008c5647750/volumes" Oct 04 03:07:25 crc kubenswrapper[4964]: I1004 03:07:25.845388 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:07:25 crc kubenswrapper[4964]: E1004 03:07:25.845793 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:07:26 crc kubenswrapper[4964]: I1004 03:07:26.600535 4964 scope.go:117] "RemoveContainer" containerID="957246df7e11551087d052bc3fc241dd0461dbec53b8b0cae9066791872c0f1d" Oct 04 03:07:26 crc kubenswrapper[4964]: I1004 03:07:26.654648 4964 scope.go:117] "RemoveContainer" containerID="439ef8ae51fee2e8b76d9e1c18c5d37ab3626298dffab2ced570b3a0a35acd72" Oct 04 03:07:26 crc kubenswrapper[4964]: I1004 03:07:26.730342 4964 scope.go:117] "RemoveContainer" containerID="5615a921688ae3cdb7bc4a565d9b1f2299a8d75acd232e359ce34b10078cabe3" Oct 04 03:07:26 crc kubenswrapper[4964]: I1004 03:07:26.766369 4964 scope.go:117] "RemoveContainer" containerID="15598fc34d35e2a401f8c568b75aa8bb19da3754d84279443a3b1b103d206005" Oct 04 03:07:26 crc kubenswrapper[4964]: I1004 03:07:26.821010 4964 scope.go:117] "RemoveContainer" containerID="cf10e2cd976ae055f82275fbfd7b235ee1d866f99cfd501646f693bc6318d43e" Oct 04 03:07:26 crc kubenswrapper[4964]: I1004 03:07:26.891146 4964 scope.go:117] "RemoveContainer" containerID="e68be7bfb795edc45c2a960b9d954fe9c1136dd9361a3c072827abb06f40bf86" Oct 04 03:07:26 crc kubenswrapper[4964]: I1004 03:07:26.927748 4964 
scope.go:117] "RemoveContainer" containerID="039b184bc3f8d943465149fe94df5806f19d5ecaa5316c9422a4b54ff08f2756" Oct 04 03:07:26 crc kubenswrapper[4964]: I1004 03:07:26.966309 4964 scope.go:117] "RemoveContainer" containerID="f65cf10e157fc8a8d0f0471e5ad0d73fb0c460ac3b9630ce9cc8295cd2f61e5a" Oct 04 03:07:27 crc kubenswrapper[4964]: I1004 03:07:27.001560 4964 scope.go:117] "RemoveContainer" containerID="ae4aa1114a4130bec97e3f870bdf46e359790cb91d51b471447033f7b62ec2ca" Oct 04 03:07:38 crc kubenswrapper[4964]: I1004 03:07:38.048766 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-bxqjz"] Oct 04 03:07:38 crc kubenswrapper[4964]: I1004 03:07:38.060442 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-bxqjz"] Oct 04 03:07:38 crc kubenswrapper[4964]: I1004 03:07:38.863391 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59df1870-e2cf-41d0-9fc9-185801b5fd6f" path="/var/lib/kubelet/pods/59df1870-e2cf-41d0-9fc9-185801b5fd6f/volumes" Oct 04 03:07:39 crc kubenswrapper[4964]: I1004 03:07:39.049317 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-6sn6p"] Oct 04 03:07:39 crc kubenswrapper[4964]: I1004 03:07:39.060179 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-km2jg"] Oct 04 03:07:39 crc kubenswrapper[4964]: I1004 03:07:39.068652 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-km2jg"] Oct 04 03:07:39 crc kubenswrapper[4964]: I1004 03:07:39.076456 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-6sn6p"] Oct 04 03:07:39 crc kubenswrapper[4964]: I1004 03:07:39.845627 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:07:39 crc kubenswrapper[4964]: E1004 03:07:39.845881 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:07:40 crc kubenswrapper[4964]: I1004 03:07:40.866792 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51b046b9-b373-4654-9f0f-ff28fc2d754c" path="/var/lib/kubelet/pods/51b046b9-b373-4654-9f0f-ff28fc2d754c/volumes" Oct 04 03:07:40 crc kubenswrapper[4964]: I1004 03:07:40.867898 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a64aceca-7ea4-4919-ba95-1c2a0349361b" path="/var/lib/kubelet/pods/a64aceca-7ea4-4919-ba95-1c2a0349361b/volumes" Oct 04 03:07:49 crc kubenswrapper[4964]: I1004 03:07:49.051198 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b2c0-account-create-6zpbp"] Oct 04 03:07:49 crc kubenswrapper[4964]: I1004 03:07:49.061170 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-eba9-account-create-sqtvk"] Oct 04 03:07:49 crc kubenswrapper[4964]: I1004 03:07:49.070050 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-eba9-account-create-sqtvk"] Oct 04 03:07:49 crc kubenswrapper[4964]: I1004 03:07:49.079423 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-b2c0-account-create-6zpbp"] Oct 04 03:07:50 crc kubenswrapper[4964]: I1004 03:07:50.858520 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:07:50 crc kubenswrapper[4964]: E1004 03:07:50.859647 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:07:50 crc kubenswrapper[4964]: I1004 03:07:50.860084 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5da7c2b0-bdcb-434a-a875-02f2d7143a32" path="/var/lib/kubelet/pods/5da7c2b0-bdcb-434a-a875-02f2d7143a32/volumes" Oct 04 03:07:50 crc kubenswrapper[4964]: I1004 03:07:50.861334 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7a1304b-a278-4813-9fa1-de46d59f5f87" path="/var/lib/kubelet/pods/b7a1304b-a278-4813-9fa1-de46d59f5f87/volumes" Oct 04 03:07:58 crc kubenswrapper[4964]: I1004 03:07:58.048587 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-21c7-account-create-gpnbv"] Oct 04 03:07:58 crc kubenswrapper[4964]: I1004 03:07:58.055638 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-21c7-account-create-gpnbv"] Oct 04 03:07:58 crc kubenswrapper[4964]: I1004 03:07:58.857100 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73278f79-5723-43d3-8038-0dc5479356cb" path="/var/lib/kubelet/pods/73278f79-5723-43d3-8038-0dc5479356cb/volumes" Oct 04 03:08:02 crc kubenswrapper[4964]: I1004 03:08:02.846215 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:08:02 crc kubenswrapper[4964]: E1004 03:08:02.846773 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:08:06 
crc kubenswrapper[4964]: I1004 03:08:06.897180 4964 generic.go:334] "Generic (PLEG): container finished" podID="84dbc334-c57b-42f9-98c0-13ec5973b663" containerID="deeda032b90e865fd09c445f6fa677b983d3db6072b30a22eb282f658b1795de" exitCode=0 Oct 04 03:08:06 crc kubenswrapper[4964]: I1004 03:08:06.897433 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" event={"ID":"84dbc334-c57b-42f9-98c0-13ec5973b663","Type":"ContainerDied","Data":"deeda032b90e865fd09c445f6fa677b983d3db6072b30a22eb282f658b1795de"} Oct 04 03:08:08 crc kubenswrapper[4964]: I1004 03:08:08.388553 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" Oct 04 03:08:08 crc kubenswrapper[4964]: I1004 03:08:08.478211 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcqvz\" (UniqueName: \"kubernetes.io/projected/84dbc334-c57b-42f9-98c0-13ec5973b663-kube-api-access-xcqvz\") pod \"84dbc334-c57b-42f9-98c0-13ec5973b663\" (UID: \"84dbc334-c57b-42f9-98c0-13ec5973b663\") " Oct 04 03:08:08 crc kubenswrapper[4964]: I1004 03:08:08.478577 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84dbc334-c57b-42f9-98c0-13ec5973b663-inventory\") pod \"84dbc334-c57b-42f9-98c0-13ec5973b663\" (UID: \"84dbc334-c57b-42f9-98c0-13ec5973b663\") " Oct 04 03:08:08 crc kubenswrapper[4964]: I1004 03:08:08.478664 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/84dbc334-c57b-42f9-98c0-13ec5973b663-ssh-key\") pod \"84dbc334-c57b-42f9-98c0-13ec5973b663\" (UID: \"84dbc334-c57b-42f9-98c0-13ec5973b663\") " Oct 04 03:08:08 crc kubenswrapper[4964]: I1004 03:08:08.486578 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/84dbc334-c57b-42f9-98c0-13ec5973b663-kube-api-access-xcqvz" (OuterVolumeSpecName: "kube-api-access-xcqvz") pod "84dbc334-c57b-42f9-98c0-13ec5973b663" (UID: "84dbc334-c57b-42f9-98c0-13ec5973b663"). InnerVolumeSpecName "kube-api-access-xcqvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:08:08 crc kubenswrapper[4964]: I1004 03:08:08.502747 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84dbc334-c57b-42f9-98c0-13ec5973b663-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "84dbc334-c57b-42f9-98c0-13ec5973b663" (UID: "84dbc334-c57b-42f9-98c0-13ec5973b663"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:08:08 crc kubenswrapper[4964]: I1004 03:08:08.529321 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84dbc334-c57b-42f9-98c0-13ec5973b663-inventory" (OuterVolumeSpecName: "inventory") pod "84dbc334-c57b-42f9-98c0-13ec5973b663" (UID: "84dbc334-c57b-42f9-98c0-13ec5973b663"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:08:08 crc kubenswrapper[4964]: I1004 03:08:08.580958 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcqvz\" (UniqueName: \"kubernetes.io/projected/84dbc334-c57b-42f9-98c0-13ec5973b663-kube-api-access-xcqvz\") on node \"crc\" DevicePath \"\"" Oct 04 03:08:08 crc kubenswrapper[4964]: I1004 03:08:08.580992 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84dbc334-c57b-42f9-98c0-13ec5973b663-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:08:08 crc kubenswrapper[4964]: I1004 03:08:08.581003 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/84dbc334-c57b-42f9-98c0-13ec5973b663-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:08:08 crc kubenswrapper[4964]: I1004 03:08:08.942788 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" event={"ID":"84dbc334-c57b-42f9-98c0-13ec5973b663","Type":"ContainerDied","Data":"f2f64335836a569dfc468ab4bc808256c7824433854ed93cf8609355dd1b5ab1"} Oct 04 03:08:08 crc kubenswrapper[4964]: I1004 03:08:08.942851 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2f64335836a569dfc468ab4bc808256c7824433854ed93cf8609355dd1b5ab1" Oct 04 03:08:08 crc kubenswrapper[4964]: I1004 03:08:08.942884 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.049513 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qrlzf"] Oct 04 03:08:09 crc kubenswrapper[4964]: E1004 03:08:09.050217 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84dbc334-c57b-42f9-98c0-13ec5973b663" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.050240 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="84dbc334-c57b-42f9-98c0-13ec5973b663" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.050483 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="84dbc334-c57b-42f9-98c0-13ec5973b663" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.051269 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.053473 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.054649 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.055165 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.058932 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.072484 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qrlzf"] Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.205572 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlzfh\" (UniqueName: \"kubernetes.io/projected/35d4e215-b999-46e5-925b-1746875e31ab-kube-api-access-xlzfh\") pod \"ssh-known-hosts-edpm-deployment-qrlzf\" (UID: \"35d4e215-b999-46e5-925b-1746875e31ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.205671 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35d4e215-b999-46e5-925b-1746875e31ab-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qrlzf\" (UID: \"35d4e215-b999-46e5-925b-1746875e31ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.205796 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35d4e215-b999-46e5-925b-1746875e31ab-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qrlzf\" (UID: \"35d4e215-b999-46e5-925b-1746875e31ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.307929 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35d4e215-b999-46e5-925b-1746875e31ab-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qrlzf\" (UID: \"35d4e215-b999-46e5-925b-1746875e31ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.308278 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlzfh\" (UniqueName: \"kubernetes.io/projected/35d4e215-b999-46e5-925b-1746875e31ab-kube-api-access-xlzfh\") pod \"ssh-known-hosts-edpm-deployment-qrlzf\" (UID: \"35d4e215-b999-46e5-925b-1746875e31ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.308383 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35d4e215-b999-46e5-925b-1746875e31ab-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qrlzf\" (UID: \"35d4e215-b999-46e5-925b-1746875e31ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.320489 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35d4e215-b999-46e5-925b-1746875e31ab-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qrlzf\" (UID: \"35d4e215-b999-46e5-925b-1746875e31ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.320861 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35d4e215-b999-46e5-925b-1746875e31ab-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qrlzf\" (UID: \"35d4e215-b999-46e5-925b-1746875e31ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.325884 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlzfh\" (UniqueName: \"kubernetes.io/projected/35d4e215-b999-46e5-925b-1746875e31ab-kube-api-access-xlzfh\") pod \"ssh-known-hosts-edpm-deployment-qrlzf\" (UID: \"35d4e215-b999-46e5-925b-1746875e31ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.376945 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.930469 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qrlzf"] Oct 04 03:08:09 crc kubenswrapper[4964]: I1004 03:08:09.953456 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" event={"ID":"35d4e215-b999-46e5-925b-1746875e31ab","Type":"ContainerStarted","Data":"05d72b526fe203389ac400e898ff45acb02241f3f9eb6f50df7d5e51842104f3"} Oct 04 03:08:10 crc kubenswrapper[4964]: I1004 03:08:10.968479 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" event={"ID":"35d4e215-b999-46e5-925b-1746875e31ab","Type":"ContainerStarted","Data":"95ffe0590adef79f10c470bd4bb72231a33f6a70cc4ff1736e4bc3b73a2a25ba"} Oct 04 03:08:11 crc kubenswrapper[4964]: I1004 03:08:11.001536 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" 
podStartSLOduration=1.451435282 podStartE2EDuration="2.001510088s" podCreationTimestamp="2025-10-04 03:08:09 +0000 UTC" firstStartedPulling="2025-10-04 03:08:09.938274589 +0000 UTC m=+1669.835233237" lastFinishedPulling="2025-10-04 03:08:10.488349355 +0000 UTC m=+1670.385308043" observedRunningTime="2025-10-04 03:08:10.992711897 +0000 UTC m=+1670.889670565" watchObservedRunningTime="2025-10-04 03:08:11.001510088 +0000 UTC m=+1670.898468746" Oct 04 03:08:14 crc kubenswrapper[4964]: I1004 03:08:14.055432 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5kmn5"] Oct 04 03:08:14 crc kubenswrapper[4964]: I1004 03:08:14.065508 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5kmn5"] Oct 04 03:08:14 crc kubenswrapper[4964]: I1004 03:08:14.858724 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19923833-9456-4f37-b45f-95d76e8b8483" path="/var/lib/kubelet/pods/19923833-9456-4f37-b45f-95d76e8b8483/volumes" Oct 04 03:08:17 crc kubenswrapper[4964]: I1004 03:08:17.845940 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:08:17 crc kubenswrapper[4964]: E1004 03:08:17.846500 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:08:19 crc kubenswrapper[4964]: I1004 03:08:19.127302 4964 generic.go:334] "Generic (PLEG): container finished" podID="35d4e215-b999-46e5-925b-1746875e31ab" containerID="95ffe0590adef79f10c470bd4bb72231a33f6a70cc4ff1736e4bc3b73a2a25ba" exitCode=0 Oct 04 03:08:19 crc 
kubenswrapper[4964]: I1004 03:08:19.127353 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" event={"ID":"35d4e215-b999-46e5-925b-1746875e31ab","Type":"ContainerDied","Data":"95ffe0590adef79f10c470bd4bb72231a33f6a70cc4ff1736e4bc3b73a2a25ba"} Oct 04 03:08:20 crc kubenswrapper[4964]: I1004 03:08:20.580647 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" Oct 04 03:08:20 crc kubenswrapper[4964]: I1004 03:08:20.663668 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35d4e215-b999-46e5-925b-1746875e31ab-ssh-key-openstack-edpm-ipam\") pod \"35d4e215-b999-46e5-925b-1746875e31ab\" (UID: \"35d4e215-b999-46e5-925b-1746875e31ab\") " Oct 04 03:08:20 crc kubenswrapper[4964]: I1004 03:08:20.663727 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35d4e215-b999-46e5-925b-1746875e31ab-inventory-0\") pod \"35d4e215-b999-46e5-925b-1746875e31ab\" (UID: \"35d4e215-b999-46e5-925b-1746875e31ab\") " Oct 04 03:08:20 crc kubenswrapper[4964]: I1004 03:08:20.663795 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlzfh\" (UniqueName: \"kubernetes.io/projected/35d4e215-b999-46e5-925b-1746875e31ab-kube-api-access-xlzfh\") pod \"35d4e215-b999-46e5-925b-1746875e31ab\" (UID: \"35d4e215-b999-46e5-925b-1746875e31ab\") " Oct 04 03:08:20 crc kubenswrapper[4964]: I1004 03:08:20.669433 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35d4e215-b999-46e5-925b-1746875e31ab-kube-api-access-xlzfh" (OuterVolumeSpecName: "kube-api-access-xlzfh") pod "35d4e215-b999-46e5-925b-1746875e31ab" (UID: "35d4e215-b999-46e5-925b-1746875e31ab"). InnerVolumeSpecName "kube-api-access-xlzfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:08:20 crc kubenswrapper[4964]: I1004 03:08:20.689194 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d4e215-b999-46e5-925b-1746875e31ab-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "35d4e215-b999-46e5-925b-1746875e31ab" (UID: "35d4e215-b999-46e5-925b-1746875e31ab"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:08:20 crc kubenswrapper[4964]: I1004 03:08:20.697230 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d4e215-b999-46e5-925b-1746875e31ab-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "35d4e215-b999-46e5-925b-1746875e31ab" (UID: "35d4e215-b999-46e5-925b-1746875e31ab"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:08:20 crc kubenswrapper[4964]: I1004 03:08:20.766547 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35d4e215-b999-46e5-925b-1746875e31ab-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 04 03:08:20 crc kubenswrapper[4964]: I1004 03:08:20.766597 4964 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35d4e215-b999-46e5-925b-1746875e31ab-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 04 03:08:20 crc kubenswrapper[4964]: I1004 03:08:20.766646 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlzfh\" (UniqueName: \"kubernetes.io/projected/35d4e215-b999-46e5-925b-1746875e31ab-kube-api-access-xlzfh\") on node \"crc\" DevicePath \"\"" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.150856 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" 
event={"ID":"35d4e215-b999-46e5-925b-1746875e31ab","Type":"ContainerDied","Data":"05d72b526fe203389ac400e898ff45acb02241f3f9eb6f50df7d5e51842104f3"} Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.151278 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05d72b526fe203389ac400e898ff45acb02241f3f9eb6f50df7d5e51842104f3" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.150895 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qrlzf" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.238727 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw"] Oct 04 03:08:21 crc kubenswrapper[4964]: E1004 03:08:21.239428 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d4e215-b999-46e5-925b-1746875e31ab" containerName="ssh-known-hosts-edpm-deployment" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.239469 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d4e215-b999-46e5-925b-1746875e31ab" containerName="ssh-known-hosts-edpm-deployment" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.239898 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d4e215-b999-46e5-925b-1746875e31ab" containerName="ssh-known-hosts-edpm-deployment" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.241148 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.245056 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.245071 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.245174 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.246093 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.252430 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw"] Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.378346 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/862b7563-669d-4dc0-8c61-3916f5d463c1-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbncw\" (UID: \"862b7563-669d-4dc0-8c61-3916f5d463c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.378526 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/862b7563-669d-4dc0-8c61-3916f5d463c1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbncw\" (UID: \"862b7563-669d-4dc0-8c61-3916f5d463c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.378968 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpg46\" (UniqueName: \"kubernetes.io/projected/862b7563-669d-4dc0-8c61-3916f5d463c1-kube-api-access-qpg46\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbncw\" (UID: \"862b7563-669d-4dc0-8c61-3916f5d463c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.481079 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpg46\" (UniqueName: \"kubernetes.io/projected/862b7563-669d-4dc0-8c61-3916f5d463c1-kube-api-access-qpg46\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbncw\" (UID: \"862b7563-669d-4dc0-8c61-3916f5d463c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.481251 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/862b7563-669d-4dc0-8c61-3916f5d463c1-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbncw\" (UID: \"862b7563-669d-4dc0-8c61-3916f5d463c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.481351 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/862b7563-669d-4dc0-8c61-3916f5d463c1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbncw\" (UID: \"862b7563-669d-4dc0-8c61-3916f5d463c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.489072 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/862b7563-669d-4dc0-8c61-3916f5d463c1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbncw\" (UID: \"862b7563-669d-4dc0-8c61-3916f5d463c1\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.492718 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/862b7563-669d-4dc0-8c61-3916f5d463c1-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbncw\" (UID: \"862b7563-669d-4dc0-8c61-3916f5d463c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.500456 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpg46\" (UniqueName: \"kubernetes.io/projected/862b7563-669d-4dc0-8c61-3916f5d463c1-kube-api-access-qpg46\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbncw\" (UID: \"862b7563-669d-4dc0-8c61-3916f5d463c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" Oct 04 03:08:21 crc kubenswrapper[4964]: I1004 03:08:21.561716 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" Oct 04 03:08:22 crc kubenswrapper[4964]: I1004 03:08:22.181764 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw"] Oct 04 03:08:22 crc kubenswrapper[4964]: W1004 03:08:22.186930 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod862b7563_669d_4dc0_8c61_3916f5d463c1.slice/crio-05a43c16e0305b35bf28376450871050225997723f2a1dd1b76a799020cbdc63 WatchSource:0}: Error finding container 05a43c16e0305b35bf28376450871050225997723f2a1dd1b76a799020cbdc63: Status 404 returned error can't find the container with id 05a43c16e0305b35bf28376450871050225997723f2a1dd1b76a799020cbdc63 Oct 04 03:08:23 crc kubenswrapper[4964]: I1004 03:08:23.171787 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" event={"ID":"862b7563-669d-4dc0-8c61-3916f5d463c1","Type":"ContainerStarted","Data":"0493986c83556f6bbe17cac8fbb444893d74fc3e897e1d4f007ab5caaa9dd7ad"} Oct 04 03:08:23 crc kubenswrapper[4964]: I1004 03:08:23.172378 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" event={"ID":"862b7563-669d-4dc0-8c61-3916f5d463c1","Type":"ContainerStarted","Data":"05a43c16e0305b35bf28376450871050225997723f2a1dd1b76a799020cbdc63"} Oct 04 03:08:23 crc kubenswrapper[4964]: I1004 03:08:23.195344 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" podStartSLOduration=1.685513375 podStartE2EDuration="2.195324111s" podCreationTimestamp="2025-10-04 03:08:21 +0000 UTC" firstStartedPulling="2025-10-04 03:08:22.18902754 +0000 UTC m=+1682.085986208" lastFinishedPulling="2025-10-04 03:08:22.698838266 +0000 UTC m=+1682.595796944" observedRunningTime="2025-10-04 
03:08:23.193349089 +0000 UTC m=+1683.090307767" watchObservedRunningTime="2025-10-04 03:08:23.195324111 +0000 UTC m=+1683.092282759" Oct 04 03:08:27 crc kubenswrapper[4964]: I1004 03:08:27.234816 4964 scope.go:117] "RemoveContainer" containerID="ed8ec2815bf7580d8357c25af4facf618bc9d71e7a6d6e8afaf17a5506c78a2f" Oct 04 03:08:27 crc kubenswrapper[4964]: I1004 03:08:27.284671 4964 scope.go:117] "RemoveContainer" containerID="bba4ab83986c42295d794c2667eeac32b25ff3dae674743d0a9ea7ba18d2289f" Oct 04 03:08:27 crc kubenswrapper[4964]: I1004 03:08:27.359980 4964 scope.go:117] "RemoveContainer" containerID="5c68b018a8b5c0240dfac8ba2cbee259480f24f37a90864ad11bf69a4e32ab8f" Oct 04 03:08:27 crc kubenswrapper[4964]: I1004 03:08:27.406050 4964 scope.go:117] "RemoveContainer" containerID="eee8449bc1dfc078ae6707261e07bbf8d7c0abdfb6122bce2f087847138ce321" Oct 04 03:08:27 crc kubenswrapper[4964]: I1004 03:08:27.467163 4964 scope.go:117] "RemoveContainer" containerID="7e5fabbe5bc6bff557d221b7c43be2fca0c1ca264a4a84cacd7b9cc30424c81c" Oct 04 03:08:27 crc kubenswrapper[4964]: I1004 03:08:27.488915 4964 scope.go:117] "RemoveContainer" containerID="58002698ab53561d18086c5eeb1c4a6e7348d28d57a523ce7015d8ed9cb13239" Oct 04 03:08:27 crc kubenswrapper[4964]: I1004 03:08:27.544593 4964 scope.go:117] "RemoveContainer" containerID="0c6f5e4885fc7704a61a7e7c551198c3b95a97f3733bc4cf18c5d66621aacde9" Oct 04 03:08:28 crc kubenswrapper[4964]: I1004 03:08:28.846017 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:08:28 crc kubenswrapper[4964]: E1004 03:08:28.846587 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:08:30 crc kubenswrapper[4964]: I1004 03:08:30.047279 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-ch92k"] Oct 04 03:08:30 crc kubenswrapper[4964]: I1004 03:08:30.069771 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-ch92k"] Oct 04 03:08:30 crc kubenswrapper[4964]: I1004 03:08:30.866364 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b2b94b6-a8c2-43d3-b5b6-10e02544fe47" path="/var/lib/kubelet/pods/3b2b94b6-a8c2-43d3-b5b6-10e02544fe47/volumes" Oct 04 03:08:31 crc kubenswrapper[4964]: I1004 03:08:31.050595 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zb6n7"] Oct 04 03:08:31 crc kubenswrapper[4964]: I1004 03:08:31.060704 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zb6n7"] Oct 04 03:08:32 crc kubenswrapper[4964]: I1004 03:08:32.288138 4964 generic.go:334] "Generic (PLEG): container finished" podID="862b7563-669d-4dc0-8c61-3916f5d463c1" containerID="0493986c83556f6bbe17cac8fbb444893d74fc3e897e1d4f007ab5caaa9dd7ad" exitCode=0 Oct 04 03:08:32 crc kubenswrapper[4964]: I1004 03:08:32.288188 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" event={"ID":"862b7563-669d-4dc0-8c61-3916f5d463c1","Type":"ContainerDied","Data":"0493986c83556f6bbe17cac8fbb444893d74fc3e897e1d4f007ab5caaa9dd7ad"} Oct 04 03:08:32 crc kubenswrapper[4964]: I1004 03:08:32.867188 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edc8e261-b823-4e47-8434-69659d723885" path="/var/lib/kubelet/pods/edc8e261-b823-4e47-8434-69659d723885/volumes" Oct 04 03:08:33 crc kubenswrapper[4964]: I1004 03:08:33.738044 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" Oct 04 03:08:33 crc kubenswrapper[4964]: I1004 03:08:33.848094 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/862b7563-669d-4dc0-8c61-3916f5d463c1-ssh-key\") pod \"862b7563-669d-4dc0-8c61-3916f5d463c1\" (UID: \"862b7563-669d-4dc0-8c61-3916f5d463c1\") " Oct 04 03:08:33 crc kubenswrapper[4964]: I1004 03:08:33.848264 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpg46\" (UniqueName: \"kubernetes.io/projected/862b7563-669d-4dc0-8c61-3916f5d463c1-kube-api-access-qpg46\") pod \"862b7563-669d-4dc0-8c61-3916f5d463c1\" (UID: \"862b7563-669d-4dc0-8c61-3916f5d463c1\") " Oct 04 03:08:33 crc kubenswrapper[4964]: I1004 03:08:33.848332 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/862b7563-669d-4dc0-8c61-3916f5d463c1-inventory\") pod \"862b7563-669d-4dc0-8c61-3916f5d463c1\" (UID: \"862b7563-669d-4dc0-8c61-3916f5d463c1\") " Oct 04 03:08:33 crc kubenswrapper[4964]: I1004 03:08:33.858082 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/862b7563-669d-4dc0-8c61-3916f5d463c1-kube-api-access-qpg46" (OuterVolumeSpecName: "kube-api-access-qpg46") pod "862b7563-669d-4dc0-8c61-3916f5d463c1" (UID: "862b7563-669d-4dc0-8c61-3916f5d463c1"). InnerVolumeSpecName "kube-api-access-qpg46". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:08:33 crc kubenswrapper[4964]: I1004 03:08:33.883310 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/862b7563-669d-4dc0-8c61-3916f5d463c1-inventory" (OuterVolumeSpecName: "inventory") pod "862b7563-669d-4dc0-8c61-3916f5d463c1" (UID: "862b7563-669d-4dc0-8c61-3916f5d463c1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:08:33 crc kubenswrapper[4964]: I1004 03:08:33.904699 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/862b7563-669d-4dc0-8c61-3916f5d463c1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "862b7563-669d-4dc0-8c61-3916f5d463c1" (UID: "862b7563-669d-4dc0-8c61-3916f5d463c1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:08:33 crc kubenswrapper[4964]: I1004 03:08:33.950457 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpg46\" (UniqueName: \"kubernetes.io/projected/862b7563-669d-4dc0-8c61-3916f5d463c1-kube-api-access-qpg46\") on node \"crc\" DevicePath \"\"" Oct 04 03:08:33 crc kubenswrapper[4964]: I1004 03:08:33.950504 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/862b7563-669d-4dc0-8c61-3916f5d463c1-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:08:33 crc kubenswrapper[4964]: I1004 03:08:33.950527 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/862b7563-669d-4dc0-8c61-3916f5d463c1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.314236 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" event={"ID":"862b7563-669d-4dc0-8c61-3916f5d463c1","Type":"ContainerDied","Data":"05a43c16e0305b35bf28376450871050225997723f2a1dd1b76a799020cbdc63"} Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.314301 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05a43c16e0305b35bf28376450871050225997723f2a1dd1b76a799020cbdc63" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.314305 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.423117 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n"] Oct 04 03:08:34 crc kubenswrapper[4964]: E1004 03:08:34.423511 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862b7563-669d-4dc0-8c61-3916f5d463c1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.423533 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="862b7563-669d-4dc0-8c61-3916f5d463c1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.423769 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="862b7563-669d-4dc0-8c61-3916f5d463c1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.424385 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.427444 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.427925 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.428701 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.429139 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.439734 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n"] Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.563849 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec622cf5-6792-4cc5-b1eb-f82e47af5027-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n\" (UID: \"ec622cf5-6792-4cc5-b1eb-f82e47af5027\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.563899 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxsjd\" (UniqueName: \"kubernetes.io/projected/ec622cf5-6792-4cc5-b1eb-f82e47af5027-kube-api-access-wxsjd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n\" (UID: \"ec622cf5-6792-4cc5-b1eb-f82e47af5027\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.564057 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec622cf5-6792-4cc5-b1eb-f82e47af5027-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n\" (UID: \"ec622cf5-6792-4cc5-b1eb-f82e47af5027\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.665080 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec622cf5-6792-4cc5-b1eb-f82e47af5027-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n\" (UID: \"ec622cf5-6792-4cc5-b1eb-f82e47af5027\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.665192 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec622cf5-6792-4cc5-b1eb-f82e47af5027-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n\" (UID: \"ec622cf5-6792-4cc5-b1eb-f82e47af5027\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.665216 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxsjd\" (UniqueName: \"kubernetes.io/projected/ec622cf5-6792-4cc5-b1eb-f82e47af5027-kube-api-access-wxsjd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n\" (UID: \"ec622cf5-6792-4cc5-b1eb-f82e47af5027\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.673707 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec622cf5-6792-4cc5-b1eb-f82e47af5027-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n\" (UID: \"ec622cf5-6792-4cc5-b1eb-f82e47af5027\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.685575 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec622cf5-6792-4cc5-b1eb-f82e47af5027-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n\" (UID: \"ec622cf5-6792-4cc5-b1eb-f82e47af5027\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.692968 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxsjd\" (UniqueName: \"kubernetes.io/projected/ec622cf5-6792-4cc5-b1eb-f82e47af5027-kube-api-access-wxsjd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n\" (UID: \"ec622cf5-6792-4cc5-b1eb-f82e47af5027\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" Oct 04 03:08:34 crc kubenswrapper[4964]: I1004 03:08:34.749627 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" Oct 04 03:08:35 crc kubenswrapper[4964]: I1004 03:08:35.274447 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n"] Oct 04 03:08:35 crc kubenswrapper[4964]: W1004 03:08:35.282712 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec622cf5_6792_4cc5_b1eb_f82e47af5027.slice/crio-28868fe569db55d8616003725488f9ae8cc1902620c58ae0d758c2b4f7e07593 WatchSource:0}: Error finding container 28868fe569db55d8616003725488f9ae8cc1902620c58ae0d758c2b4f7e07593: Status 404 returned error can't find the container with id 28868fe569db55d8616003725488f9ae8cc1902620c58ae0d758c2b4f7e07593 Oct 04 03:08:35 crc kubenswrapper[4964]: I1004 03:08:35.328895 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" event={"ID":"ec622cf5-6792-4cc5-b1eb-f82e47af5027","Type":"ContainerStarted","Data":"28868fe569db55d8616003725488f9ae8cc1902620c58ae0d758c2b4f7e07593"} Oct 04 03:08:36 crc kubenswrapper[4964]: I1004 03:08:36.343184 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" event={"ID":"ec622cf5-6792-4cc5-b1eb-f82e47af5027","Type":"ContainerStarted","Data":"6246ea77b76fd9b65defb470a089720c73f6fd8e36721940e32bbef122d4f24c"} Oct 04 03:08:36 crc kubenswrapper[4964]: I1004 03:08:36.369114 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" podStartSLOduration=1.894648117 podStartE2EDuration="2.369096724s" podCreationTimestamp="2025-10-04 03:08:34 +0000 UTC" firstStartedPulling="2025-10-04 03:08:35.284985196 +0000 UTC m=+1695.181943874" lastFinishedPulling="2025-10-04 03:08:35.759433833 +0000 UTC m=+1695.656392481" 
observedRunningTime="2025-10-04 03:08:36.363744863 +0000 UTC m=+1696.260703501" watchObservedRunningTime="2025-10-04 03:08:36.369096724 +0000 UTC m=+1696.266055362" Oct 04 03:08:42 crc kubenswrapper[4964]: I1004 03:08:42.845940 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:08:42 crc kubenswrapper[4964]: E1004 03:08:42.847176 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:08:46 crc kubenswrapper[4964]: I1004 03:08:46.473526 4964 generic.go:334] "Generic (PLEG): container finished" podID="ec622cf5-6792-4cc5-b1eb-f82e47af5027" containerID="6246ea77b76fd9b65defb470a089720c73f6fd8e36721940e32bbef122d4f24c" exitCode=0 Oct 04 03:08:46 crc kubenswrapper[4964]: I1004 03:08:46.474284 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" event={"ID":"ec622cf5-6792-4cc5-b1eb-f82e47af5027","Type":"ContainerDied","Data":"6246ea77b76fd9b65defb470a089720c73f6fd8e36721940e32bbef122d4f24c"} Oct 04 03:08:48 crc kubenswrapper[4964]: I1004 03:08:48.044737 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" Oct 04 03:08:48 crc kubenswrapper[4964]: I1004 03:08:48.126487 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec622cf5-6792-4cc5-b1eb-f82e47af5027-inventory\") pod \"ec622cf5-6792-4cc5-b1eb-f82e47af5027\" (UID: \"ec622cf5-6792-4cc5-b1eb-f82e47af5027\") " Oct 04 03:08:48 crc kubenswrapper[4964]: I1004 03:08:48.126685 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec622cf5-6792-4cc5-b1eb-f82e47af5027-ssh-key\") pod \"ec622cf5-6792-4cc5-b1eb-f82e47af5027\" (UID: \"ec622cf5-6792-4cc5-b1eb-f82e47af5027\") " Oct 04 03:08:48 crc kubenswrapper[4964]: I1004 03:08:48.126983 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxsjd\" (UniqueName: \"kubernetes.io/projected/ec622cf5-6792-4cc5-b1eb-f82e47af5027-kube-api-access-wxsjd\") pod \"ec622cf5-6792-4cc5-b1eb-f82e47af5027\" (UID: \"ec622cf5-6792-4cc5-b1eb-f82e47af5027\") " Oct 04 03:08:48 crc kubenswrapper[4964]: I1004 03:08:48.132830 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec622cf5-6792-4cc5-b1eb-f82e47af5027-kube-api-access-wxsjd" (OuterVolumeSpecName: "kube-api-access-wxsjd") pod "ec622cf5-6792-4cc5-b1eb-f82e47af5027" (UID: "ec622cf5-6792-4cc5-b1eb-f82e47af5027"). InnerVolumeSpecName "kube-api-access-wxsjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:08:48 crc kubenswrapper[4964]: I1004 03:08:48.158231 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec622cf5-6792-4cc5-b1eb-f82e47af5027-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ec622cf5-6792-4cc5-b1eb-f82e47af5027" (UID: "ec622cf5-6792-4cc5-b1eb-f82e47af5027"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:08:48 crc kubenswrapper[4964]: I1004 03:08:48.163246 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec622cf5-6792-4cc5-b1eb-f82e47af5027-inventory" (OuterVolumeSpecName: "inventory") pod "ec622cf5-6792-4cc5-b1eb-f82e47af5027" (UID: "ec622cf5-6792-4cc5-b1eb-f82e47af5027"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:08:48 crc kubenswrapper[4964]: I1004 03:08:48.229741 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec622cf5-6792-4cc5-b1eb-f82e47af5027-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:08:48 crc kubenswrapper[4964]: I1004 03:08:48.229779 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec622cf5-6792-4cc5-b1eb-f82e47af5027-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:08:48 crc kubenswrapper[4964]: I1004 03:08:48.229793 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxsjd\" (UniqueName: \"kubernetes.io/projected/ec622cf5-6792-4cc5-b1eb-f82e47af5027-kube-api-access-wxsjd\") on node \"crc\" DevicePath \"\"" Oct 04 03:08:48 crc kubenswrapper[4964]: I1004 03:08:48.497329 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" event={"ID":"ec622cf5-6792-4cc5-b1eb-f82e47af5027","Type":"ContainerDied","Data":"28868fe569db55d8616003725488f9ae8cc1902620c58ae0d758c2b4f7e07593"} Oct 04 03:08:48 crc kubenswrapper[4964]: I1004 03:08:48.497709 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28868fe569db55d8616003725488f9ae8cc1902620c58ae0d758c2b4f7e07593" Oct 04 03:08:48 crc kubenswrapper[4964]: I1004 03:08:48.497437 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n" Oct 04 03:08:56 crc kubenswrapper[4964]: I1004 03:08:56.846648 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:08:56 crc kubenswrapper[4964]: E1004 03:08:56.847790 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:09:11 crc kubenswrapper[4964]: I1004 03:09:11.845792 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:09:11 crc kubenswrapper[4964]: E1004 03:09:11.846763 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:09:16 crc kubenswrapper[4964]: I1004 03:09:16.062053 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-ns9dn"] Oct 04 03:09:16 crc kubenswrapper[4964]: I1004 03:09:16.074467 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-ns9dn"] Oct 04 03:09:16 crc kubenswrapper[4964]: I1004 03:09:16.875943 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edf8a38b-136c-4af9-b766-136bdfabd69d" 
path="/var/lib/kubelet/pods/edf8a38b-136c-4af9-b766-136bdfabd69d/volumes" Oct 04 03:09:23 crc kubenswrapper[4964]: I1004 03:09:23.846163 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:09:23 crc kubenswrapper[4964]: E1004 03:09:23.847391 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:09:27 crc kubenswrapper[4964]: I1004 03:09:27.683368 4964 scope.go:117] "RemoveContainer" containerID="8f91cd4dd407ac9f844c11529b2191c1c27a813b20c8ef2c6a9b0037ef82dde7" Oct 04 03:09:27 crc kubenswrapper[4964]: I1004 03:09:27.753202 4964 scope.go:117] "RemoveContainer" containerID="aadada084f99d6f23988d9418c9ae17e657a6447246d9cd6db110f53606f179c" Oct 04 03:09:27 crc kubenswrapper[4964]: I1004 03:09:27.801339 4964 scope.go:117] "RemoveContainer" containerID="36174e4e8620c40cd826109a2f529d1fe17bfffa0bf91225f780722d04ad006f" Oct 04 03:09:35 crc kubenswrapper[4964]: I1004 03:09:35.845969 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:09:35 crc kubenswrapper[4964]: E1004 03:09:35.846768 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:09:48 crc kubenswrapper[4964]: 
I1004 03:09:48.380482 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-54dt7"] Oct 04 03:09:48 crc kubenswrapper[4964]: E1004 03:09:48.381323 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec622cf5-6792-4cc5-b1eb-f82e47af5027" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:09:48 crc kubenswrapper[4964]: I1004 03:09:48.381340 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec622cf5-6792-4cc5-b1eb-f82e47af5027" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:09:48 crc kubenswrapper[4964]: I1004 03:09:48.381641 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec622cf5-6792-4cc5-b1eb-f82e47af5027" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:09:48 crc kubenswrapper[4964]: I1004 03:09:48.383220 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:09:48 crc kubenswrapper[4964]: I1004 03:09:48.392073 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-54dt7"] Oct 04 03:09:48 crc kubenswrapper[4964]: I1004 03:09:48.409666 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-utilities\") pod \"certified-operators-54dt7\" (UID: \"6bf3fc24-35d2-4a3c-8994-60f9938dbc32\") " pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:09:48 crc kubenswrapper[4964]: I1004 03:09:48.409744 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k4hn\" (UniqueName: \"kubernetes.io/projected/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-kube-api-access-7k4hn\") pod \"certified-operators-54dt7\" (UID: \"6bf3fc24-35d2-4a3c-8994-60f9938dbc32\") " 
pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:09:48 crc kubenswrapper[4964]: I1004 03:09:48.409791 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-catalog-content\") pod \"certified-operators-54dt7\" (UID: \"6bf3fc24-35d2-4a3c-8994-60f9938dbc32\") " pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:09:48 crc kubenswrapper[4964]: I1004 03:09:48.510901 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-utilities\") pod \"certified-operators-54dt7\" (UID: \"6bf3fc24-35d2-4a3c-8994-60f9938dbc32\") " pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:09:48 crc kubenswrapper[4964]: I1004 03:09:48.510952 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k4hn\" (UniqueName: \"kubernetes.io/projected/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-kube-api-access-7k4hn\") pod \"certified-operators-54dt7\" (UID: \"6bf3fc24-35d2-4a3c-8994-60f9938dbc32\") " pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:09:48 crc kubenswrapper[4964]: I1004 03:09:48.510982 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-catalog-content\") pod \"certified-operators-54dt7\" (UID: \"6bf3fc24-35d2-4a3c-8994-60f9938dbc32\") " pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:09:48 crc kubenswrapper[4964]: I1004 03:09:48.511399 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-catalog-content\") pod \"certified-operators-54dt7\" (UID: \"6bf3fc24-35d2-4a3c-8994-60f9938dbc32\") " 
pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:09:48 crc kubenswrapper[4964]: I1004 03:09:48.511686 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-utilities\") pod \"certified-operators-54dt7\" (UID: \"6bf3fc24-35d2-4a3c-8994-60f9938dbc32\") " pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:09:48 crc kubenswrapper[4964]: I1004 03:09:48.535831 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k4hn\" (UniqueName: \"kubernetes.io/projected/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-kube-api-access-7k4hn\") pod \"certified-operators-54dt7\" (UID: \"6bf3fc24-35d2-4a3c-8994-60f9938dbc32\") " pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:09:48 crc kubenswrapper[4964]: I1004 03:09:48.703820 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:09:49 crc kubenswrapper[4964]: I1004 03:09:49.217278 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-54dt7"] Oct 04 03:09:49 crc kubenswrapper[4964]: W1004 03:09:49.221759 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bf3fc24_35d2_4a3c_8994_60f9938dbc32.slice/crio-95377d0100dc32fa34dc349a23843054396a40c326cbf532da4690ae45fa4050 WatchSource:0}: Error finding container 95377d0100dc32fa34dc349a23843054396a40c326cbf532da4690ae45fa4050: Status 404 returned error can't find the container with id 95377d0100dc32fa34dc349a23843054396a40c326cbf532da4690ae45fa4050 Oct 04 03:09:50 crc kubenswrapper[4964]: I1004 03:09:50.154810 4964 generic.go:334] "Generic (PLEG): container finished" podID="6bf3fc24-35d2-4a3c-8994-60f9938dbc32" containerID="0e0760669455846b27aefd6c91ff696119076ff592cb3f3e9d45996f6328109c" exitCode=0 Oct 04 
03:09:50 crc kubenswrapper[4964]: I1004 03:09:50.154876 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54dt7" event={"ID":"6bf3fc24-35d2-4a3c-8994-60f9938dbc32","Type":"ContainerDied","Data":"0e0760669455846b27aefd6c91ff696119076ff592cb3f3e9d45996f6328109c"} Oct 04 03:09:50 crc kubenswrapper[4964]: I1004 03:09:50.155256 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54dt7" event={"ID":"6bf3fc24-35d2-4a3c-8994-60f9938dbc32","Type":"ContainerStarted","Data":"95377d0100dc32fa34dc349a23843054396a40c326cbf532da4690ae45fa4050"} Oct 04 03:09:50 crc kubenswrapper[4964]: I1004 03:09:50.885679 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:09:50 crc kubenswrapper[4964]: E1004 03:09:50.886842 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:09:51 crc kubenswrapper[4964]: I1004 03:09:51.165279 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54dt7" event={"ID":"6bf3fc24-35d2-4a3c-8994-60f9938dbc32","Type":"ContainerStarted","Data":"4aab61348e49be61ca856d81cf3dfba3ff04f1005570da12d292304dd20a72df"} Oct 04 03:09:52 crc kubenswrapper[4964]: I1004 03:09:52.176507 4964 generic.go:334] "Generic (PLEG): container finished" podID="6bf3fc24-35d2-4a3c-8994-60f9938dbc32" containerID="4aab61348e49be61ca856d81cf3dfba3ff04f1005570da12d292304dd20a72df" exitCode=0 Oct 04 03:09:52 crc kubenswrapper[4964]: I1004 03:09:52.176860 4964 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-54dt7" event={"ID":"6bf3fc24-35d2-4a3c-8994-60f9938dbc32","Type":"ContainerDied","Data":"4aab61348e49be61ca856d81cf3dfba3ff04f1005570da12d292304dd20a72df"} Oct 04 03:09:53 crc kubenswrapper[4964]: I1004 03:09:53.191183 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54dt7" event={"ID":"6bf3fc24-35d2-4a3c-8994-60f9938dbc32","Type":"ContainerStarted","Data":"67bf9fa04eccbca28cf32b689b9c1d84cb679dc23e0ae3c1e9136a3ca2c87f17"} Oct 04 03:09:53 crc kubenswrapper[4964]: I1004 03:09:53.209138 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-54dt7" podStartSLOduration=2.790649454 podStartE2EDuration="5.209122432s" podCreationTimestamp="2025-10-04 03:09:48 +0000 UTC" firstStartedPulling="2025-10-04 03:09:50.158246056 +0000 UTC m=+1770.055204704" lastFinishedPulling="2025-10-04 03:09:52.576719014 +0000 UTC m=+1772.473677682" observedRunningTime="2025-10-04 03:09:53.207301244 +0000 UTC m=+1773.104259942" watchObservedRunningTime="2025-10-04 03:09:53.209122432 +0000 UTC m=+1773.106081070" Oct 04 03:09:58 crc kubenswrapper[4964]: I1004 03:09:58.704135 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:09:58 crc kubenswrapper[4964]: I1004 03:09:58.704713 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:09:58 crc kubenswrapper[4964]: I1004 03:09:58.789847 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:09:59 crc kubenswrapper[4964]: I1004 03:09:59.312540 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:09:59 crc kubenswrapper[4964]: I1004 03:09:59.375062 4964 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-54dt7"] Oct 04 03:10:01 crc kubenswrapper[4964]: I1004 03:10:01.273199 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-54dt7" podUID="6bf3fc24-35d2-4a3c-8994-60f9938dbc32" containerName="registry-server" containerID="cri-o://67bf9fa04eccbca28cf32b689b9c1d84cb679dc23e0ae3c1e9136a3ca2c87f17" gracePeriod=2 Oct 04 03:10:01 crc kubenswrapper[4964]: I1004 03:10:01.867817 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:10:01 crc kubenswrapper[4964]: I1004 03:10:01.913749 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-utilities\") pod \"6bf3fc24-35d2-4a3c-8994-60f9938dbc32\" (UID: \"6bf3fc24-35d2-4a3c-8994-60f9938dbc32\") " Oct 04 03:10:01 crc kubenswrapper[4964]: I1004 03:10:01.914179 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k4hn\" (UniqueName: \"kubernetes.io/projected/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-kube-api-access-7k4hn\") pod \"6bf3fc24-35d2-4a3c-8994-60f9938dbc32\" (UID: \"6bf3fc24-35d2-4a3c-8994-60f9938dbc32\") " Oct 04 03:10:01 crc kubenswrapper[4964]: I1004 03:10:01.914393 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-catalog-content\") pod \"6bf3fc24-35d2-4a3c-8994-60f9938dbc32\" (UID: \"6bf3fc24-35d2-4a3c-8994-60f9938dbc32\") " Oct 04 03:10:01 crc kubenswrapper[4964]: I1004 03:10:01.914875 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-utilities" (OuterVolumeSpecName: "utilities") pod 
"6bf3fc24-35d2-4a3c-8994-60f9938dbc32" (UID: "6bf3fc24-35d2-4a3c-8994-60f9938dbc32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:10:01 crc kubenswrapper[4964]: I1004 03:10:01.916315 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:10:01 crc kubenswrapper[4964]: I1004 03:10:01.921263 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-kube-api-access-7k4hn" (OuterVolumeSpecName: "kube-api-access-7k4hn") pod "6bf3fc24-35d2-4a3c-8994-60f9938dbc32" (UID: "6bf3fc24-35d2-4a3c-8994-60f9938dbc32"). InnerVolumeSpecName "kube-api-access-7k4hn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:10:01 crc kubenswrapper[4964]: I1004 03:10:01.984998 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bf3fc24-35d2-4a3c-8994-60f9938dbc32" (UID: "6bf3fc24-35d2-4a3c-8994-60f9938dbc32"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.017793 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.017834 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k4hn\" (UniqueName: \"kubernetes.io/projected/6bf3fc24-35d2-4a3c-8994-60f9938dbc32-kube-api-access-7k4hn\") on node \"crc\" DevicePath \"\"" Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.287581 4964 generic.go:334] "Generic (PLEG): container finished" podID="6bf3fc24-35d2-4a3c-8994-60f9938dbc32" containerID="67bf9fa04eccbca28cf32b689b9c1d84cb679dc23e0ae3c1e9136a3ca2c87f17" exitCode=0 Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.287672 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54dt7" event={"ID":"6bf3fc24-35d2-4a3c-8994-60f9938dbc32","Type":"ContainerDied","Data":"67bf9fa04eccbca28cf32b689b9c1d84cb679dc23e0ae3c1e9136a3ca2c87f17"} Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.287722 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-54dt7" Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.287756 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54dt7" event={"ID":"6bf3fc24-35d2-4a3c-8994-60f9938dbc32","Type":"ContainerDied","Data":"95377d0100dc32fa34dc349a23843054396a40c326cbf532da4690ae45fa4050"} Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.287798 4964 scope.go:117] "RemoveContainer" containerID="67bf9fa04eccbca28cf32b689b9c1d84cb679dc23e0ae3c1e9136a3ca2c87f17" Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.324796 4964 scope.go:117] "RemoveContainer" containerID="4aab61348e49be61ca856d81cf3dfba3ff04f1005570da12d292304dd20a72df" Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.372648 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-54dt7"] Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.381976 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-54dt7"] Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.390064 4964 scope.go:117] "RemoveContainer" containerID="0e0760669455846b27aefd6c91ff696119076ff592cb3f3e9d45996f6328109c" Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.441738 4964 scope.go:117] "RemoveContainer" containerID="67bf9fa04eccbca28cf32b689b9c1d84cb679dc23e0ae3c1e9136a3ca2c87f17" Oct 04 03:10:02 crc kubenswrapper[4964]: E1004 03:10:02.443022 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67bf9fa04eccbca28cf32b689b9c1d84cb679dc23e0ae3c1e9136a3ca2c87f17\": container with ID starting with 67bf9fa04eccbca28cf32b689b9c1d84cb679dc23e0ae3c1e9136a3ca2c87f17 not found: ID does not exist" containerID="67bf9fa04eccbca28cf32b689b9c1d84cb679dc23e0ae3c1e9136a3ca2c87f17" Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.443079 4964 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67bf9fa04eccbca28cf32b689b9c1d84cb679dc23e0ae3c1e9136a3ca2c87f17"} err="failed to get container status \"67bf9fa04eccbca28cf32b689b9c1d84cb679dc23e0ae3c1e9136a3ca2c87f17\": rpc error: code = NotFound desc = could not find container \"67bf9fa04eccbca28cf32b689b9c1d84cb679dc23e0ae3c1e9136a3ca2c87f17\": container with ID starting with 67bf9fa04eccbca28cf32b689b9c1d84cb679dc23e0ae3c1e9136a3ca2c87f17 not found: ID does not exist" Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.443112 4964 scope.go:117] "RemoveContainer" containerID="4aab61348e49be61ca856d81cf3dfba3ff04f1005570da12d292304dd20a72df" Oct 04 03:10:02 crc kubenswrapper[4964]: E1004 03:10:02.445597 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aab61348e49be61ca856d81cf3dfba3ff04f1005570da12d292304dd20a72df\": container with ID starting with 4aab61348e49be61ca856d81cf3dfba3ff04f1005570da12d292304dd20a72df not found: ID does not exist" containerID="4aab61348e49be61ca856d81cf3dfba3ff04f1005570da12d292304dd20a72df" Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.445656 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aab61348e49be61ca856d81cf3dfba3ff04f1005570da12d292304dd20a72df"} err="failed to get container status \"4aab61348e49be61ca856d81cf3dfba3ff04f1005570da12d292304dd20a72df\": rpc error: code = NotFound desc = could not find container \"4aab61348e49be61ca856d81cf3dfba3ff04f1005570da12d292304dd20a72df\": container with ID starting with 4aab61348e49be61ca856d81cf3dfba3ff04f1005570da12d292304dd20a72df not found: ID does not exist" Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.445677 4964 scope.go:117] "RemoveContainer" containerID="0e0760669455846b27aefd6c91ff696119076ff592cb3f3e9d45996f6328109c" Oct 04 03:10:02 crc kubenswrapper[4964]: E1004 
03:10:02.446032 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e0760669455846b27aefd6c91ff696119076ff592cb3f3e9d45996f6328109c\": container with ID starting with 0e0760669455846b27aefd6c91ff696119076ff592cb3f3e9d45996f6328109c not found: ID does not exist" containerID="0e0760669455846b27aefd6c91ff696119076ff592cb3f3e9d45996f6328109c" Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.446058 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e0760669455846b27aefd6c91ff696119076ff592cb3f3e9d45996f6328109c"} err="failed to get container status \"0e0760669455846b27aefd6c91ff696119076ff592cb3f3e9d45996f6328109c\": rpc error: code = NotFound desc = could not find container \"0e0760669455846b27aefd6c91ff696119076ff592cb3f3e9d45996f6328109c\": container with ID starting with 0e0760669455846b27aefd6c91ff696119076ff592cb3f3e9d45996f6328109c not found: ID does not exist" Oct 04 03:10:02 crc kubenswrapper[4964]: I1004 03:10:02.860549 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bf3fc24-35d2-4a3c-8994-60f9938dbc32" path="/var/lib/kubelet/pods/6bf3fc24-35d2-4a3c-8994-60f9938dbc32/volumes" Oct 04 03:10:05 crc kubenswrapper[4964]: I1004 03:10:05.845288 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:10:05 crc kubenswrapper[4964]: E1004 03:10:05.846362 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:10:20 crc kubenswrapper[4964]: I1004 03:10:20.858031 
4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:10:20 crc kubenswrapper[4964]: E1004 03:10:20.859196 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:10:34 crc kubenswrapper[4964]: I1004 03:10:34.846258 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:10:35 crc kubenswrapper[4964]: I1004 03:10:35.665525 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"808711bbb9b5290bb75b5f959551d10a31c81e759a35e4ab4abfe0ca4eaf4a72"} Oct 04 03:13:04 crc kubenswrapper[4964]: I1004 03:13:04.449183 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:13:04 crc kubenswrapper[4964]: I1004 03:13:04.450095 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:13:15 crc kubenswrapper[4964]: I1004 03:13:15.886109 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh"] Oct 04 03:13:15 crc kubenswrapper[4964]: I1004 03:13:15.899366 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jdtnh"] Oct 04 03:13:15 crc kubenswrapper[4964]: I1004 03:13:15.912737 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qrlzf"] Oct 04 03:13:15 crc kubenswrapper[4964]: I1004 03:13:15.920353 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw"] Oct 04 03:13:15 crc kubenswrapper[4964]: I1004 03:13:15.926471 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qrlzf"] Oct 04 03:13:15 crc kubenswrapper[4964]: I1004 03:13:15.932228 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9"] Oct 04 03:13:15 crc kubenswrapper[4964]: I1004 03:13:15.955747 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx"] Oct 04 03:13:15 crc kubenswrapper[4964]: I1004 03:13:15.964161 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hfxvw"] Oct 04 03:13:15 crc kubenswrapper[4964]: I1004 03:13:15.972071 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n"] Oct 04 03:13:15 crc kubenswrapper[4964]: I1004 03:13:15.979123 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq"] Oct 04 03:13:15 crc kubenswrapper[4964]: I1004 03:13:15.984192 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw"] Oct 04 03:13:15 crc kubenswrapper[4964]: I1004 
03:13:15.989313 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh"] Oct 04 03:13:15 crc kubenswrapper[4964]: I1004 03:13:15.994649 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw"] Oct 04 03:13:15 crc kubenswrapper[4964]: I1004 03:13:15.999437 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-krz9n"] Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.004228 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gbmxx"] Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.009120 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5"] Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.014371 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-p5jgq"] Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.020199 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxxdw"] Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.026229 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7nkh"] Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.031898 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbncw"] Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.036988 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wwdb9"] Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.041602 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgxk5"] Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.864957 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f3f08c-cbee-4b5a-9f06-60c9bf2978c4" path="/var/lib/kubelet/pods/01f3f08c-cbee-4b5a-9f06-60c9bf2978c4/volumes" Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.867121 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dba0c3e-e5d7-4b39-a2c6-403a93020983" path="/var/lib/kubelet/pods/2dba0c3e-e5d7-4b39-a2c6-403a93020983/volumes" Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.868498 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35d4e215-b999-46e5-925b-1746875e31ab" path="/var/lib/kubelet/pods/35d4e215-b999-46e5-925b-1746875e31ab/volumes" Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.869700 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50426398-6c3b-4482-a791-f5f98ec0f076" path="/var/lib/kubelet/pods/50426398-6c3b-4482-a791-f5f98ec0f076/volumes" Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.871608 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="590233c5-797b-4b2c-a0e1-c9123b45ba6e" path="/var/lib/kubelet/pods/590233c5-797b-4b2c-a0e1-c9123b45ba6e/volumes" Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.872212 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c408fde-7339-474f-a9c4-c028570fbd40" path="/var/lib/kubelet/pods/7c408fde-7339-474f-a9c4-c028570fbd40/volumes" Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.872854 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ffdf276-5128-4a2e-8ccf-18210ada6acf" path="/var/lib/kubelet/pods/7ffdf276-5128-4a2e-8ccf-18210ada6acf/volumes" Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.873965 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84dbc334-c57b-42f9-98c0-13ec5973b663" 
path="/var/lib/kubelet/pods/84dbc334-c57b-42f9-98c0-13ec5973b663/volumes" Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.874454 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="862b7563-669d-4dc0-8c61-3916f5d463c1" path="/var/lib/kubelet/pods/862b7563-669d-4dc0-8c61-3916f5d463c1/volumes" Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.875054 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b6c804-8b89-4692-b6a2-54f606a700a8" path="/var/lib/kubelet/pods/a3b6c804-8b89-4692-b6a2-54f606a700a8/volumes" Oct 04 03:13:16 crc kubenswrapper[4964]: I1004 03:13:16.876010 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec622cf5-6792-4cc5-b1eb-f82e47af5027" path="/var/lib/kubelet/pods/ec622cf5-6792-4cc5-b1eb-f82e47af5027/volumes" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.816488 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg"] Oct 04 03:13:21 crc kubenswrapper[4964]: E1004 03:13:21.817383 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf3fc24-35d2-4a3c-8994-60f9938dbc32" containerName="registry-server" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.817398 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf3fc24-35d2-4a3c-8994-60f9938dbc32" containerName="registry-server" Oct 04 03:13:21 crc kubenswrapper[4964]: E1004 03:13:21.817426 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf3fc24-35d2-4a3c-8994-60f9938dbc32" containerName="extract-content" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.817435 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf3fc24-35d2-4a3c-8994-60f9938dbc32" containerName="extract-content" Oct 04 03:13:21 crc kubenswrapper[4964]: E1004 03:13:21.817466 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf3fc24-35d2-4a3c-8994-60f9938dbc32" 
containerName="extract-utilities" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.817474 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf3fc24-35d2-4a3c-8994-60f9938dbc32" containerName="extract-utilities" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.817680 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bf3fc24-35d2-4a3c-8994-60f9938dbc32" containerName="registry-server" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.818321 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.820116 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.820461 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.820581 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.820869 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.821090 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.833140 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg"] Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.980259 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.980400 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.980419 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9tft\" (UniqueName: \"kubernetes.io/projected/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-kube-api-access-q9tft\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.980491 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:21 crc kubenswrapper[4964]: I1004 03:13:21.980560 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:22 crc kubenswrapper[4964]: I1004 03:13:22.082371 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:22 crc kubenswrapper[4964]: I1004 03:13:22.082448 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:22 crc kubenswrapper[4964]: I1004 03:13:22.082469 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9tft\" (UniqueName: \"kubernetes.io/projected/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-kube-api-access-q9tft\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:22 crc kubenswrapper[4964]: I1004 03:13:22.082503 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:22 crc kubenswrapper[4964]: I1004 03:13:22.082546 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:22 crc kubenswrapper[4964]: I1004 03:13:22.088810 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:22 crc kubenswrapper[4964]: I1004 03:13:22.089167 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:22 crc kubenswrapper[4964]: I1004 03:13:22.089995 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:22 crc kubenswrapper[4964]: I1004 03:13:22.094159 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:22 crc 
kubenswrapper[4964]: I1004 03:13:22.099644 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9tft\" (UniqueName: \"kubernetes.io/projected/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-kube-api-access-q9tft\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:22 crc kubenswrapper[4964]: I1004 03:13:22.134914 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:22 crc kubenswrapper[4964]: W1004 03:13:22.649771 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f74c64d_ceb1_4e0b_b1e3_dad667f2bd7a.slice/crio-3df50ac07c5a4b3ef211e8c03e647da24faddd8fd2347c0105e0fa3ed39e80de WatchSource:0}: Error finding container 3df50ac07c5a4b3ef211e8c03e647da24faddd8fd2347c0105e0fa3ed39e80de: Status 404 returned error can't find the container with id 3df50ac07c5a4b3ef211e8c03e647da24faddd8fd2347c0105e0fa3ed39e80de Oct 04 03:13:22 crc kubenswrapper[4964]: I1004 03:13:22.652382 4964 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 03:13:22 crc kubenswrapper[4964]: I1004 03:13:22.653940 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg"] Oct 04 03:13:23 crc kubenswrapper[4964]: I1004 03:13:23.471275 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" event={"ID":"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a","Type":"ContainerStarted","Data":"682f1378a8c7e963b128e931daa74a7bd99badfa58120823a6f35920e8073ddb"} Oct 04 03:13:23 crc kubenswrapper[4964]: I1004 03:13:23.471547 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" event={"ID":"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a","Type":"ContainerStarted","Data":"3df50ac07c5a4b3ef211e8c03e647da24faddd8fd2347c0105e0fa3ed39e80de"} Oct 04 03:13:23 crc kubenswrapper[4964]: I1004 03:13:23.496111 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" podStartSLOduration=1.974044248 podStartE2EDuration="2.496088432s" podCreationTimestamp="2025-10-04 03:13:21 +0000 UTC" firstStartedPulling="2025-10-04 03:13:22.65204351 +0000 UTC m=+1982.549002158" lastFinishedPulling="2025-10-04 03:13:23.174087674 +0000 UTC m=+1983.071046342" observedRunningTime="2025-10-04 03:13:23.491465159 +0000 UTC m=+1983.388423797" watchObservedRunningTime="2025-10-04 03:13:23.496088432 +0000 UTC m=+1983.393047070" Oct 04 03:13:28 crc kubenswrapper[4964]: I1004 03:13:28.016165 4964 scope.go:117] "RemoveContainer" containerID="5b99ce90eb7266f89e6c9bf905b48a6c5c4d4f12f1dfe0b37020b5bd30e410b4" Oct 04 03:13:28 crc kubenswrapper[4964]: I1004 03:13:28.073094 4964 scope.go:117] "RemoveContainer" containerID="1321d6376d84230cda55e9a6943cc1f92e5653e51833ca64edb440b72491ee27" Oct 04 03:13:28 crc kubenswrapper[4964]: I1004 03:13:28.121989 4964 scope.go:117] "RemoveContainer" containerID="2f464a09fcb93ec8c7bf0719420802609ee2b7a6c1c66d8d15f0deccabd86bd6" Oct 04 03:13:28 crc kubenswrapper[4964]: I1004 03:13:28.191146 4964 scope.go:117] "RemoveContainer" containerID="fc93f8897872fbc7302743cac1691820166790a59b5676def40e260ba6fe729f" Oct 04 03:13:28 crc kubenswrapper[4964]: I1004 03:13:28.243535 4964 scope.go:117] "RemoveContainer" containerID="bf4f6baf90ed202d3070a26fd63fd75c14dcee462ccb781ea45612d8796b29ff" Oct 04 03:13:28 crc kubenswrapper[4964]: I1004 03:13:28.272381 4964 scope.go:117] "RemoveContainer" containerID="90b7151b7982209568cd6b2c39af571f19cd429a8334b88edd32faed69f09b6a" Oct 04 03:13:28 crc kubenswrapper[4964]: I1004 
03:13:28.329791 4964 scope.go:117] "RemoveContainer" containerID="f93716ed7b6c7d61cf7d3834df510d8076b52b3ef9ba5656f18ec9c5a5a6145c" Oct 04 03:13:28 crc kubenswrapper[4964]: I1004 03:13:28.376403 4964 scope.go:117] "RemoveContainer" containerID="deeda032b90e865fd09c445f6fa677b983d3db6072b30a22eb282f658b1795de" Oct 04 03:13:34 crc kubenswrapper[4964]: I1004 03:13:34.449850 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:13:34 crc kubenswrapper[4964]: I1004 03:13:34.450290 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:13:35 crc kubenswrapper[4964]: I1004 03:13:35.599672 4964 generic.go:334] "Generic (PLEG): container finished" podID="1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a" containerID="682f1378a8c7e963b128e931daa74a7bd99badfa58120823a6f35920e8073ddb" exitCode=0 Oct 04 03:13:35 crc kubenswrapper[4964]: I1004 03:13:35.599774 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" event={"ID":"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a","Type":"ContainerDied","Data":"682f1378a8c7e963b128e931daa74a7bd99badfa58120823a6f35920e8073ddb"} Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.029830 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.189594 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-inventory\") pod \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.190192 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-ceph\") pod \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.190721 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-ssh-key\") pod \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.191563 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-repo-setup-combined-ca-bundle\") pod \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.191935 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9tft\" (UniqueName: \"kubernetes.io/projected/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-kube-api-access-q9tft\") pod \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\" (UID: \"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a\") " Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.196921 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-ceph" (OuterVolumeSpecName: "ceph") pod "1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a" (UID: "1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.197846 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-kube-api-access-q9tft" (OuterVolumeSpecName: "kube-api-access-q9tft") pod "1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a" (UID: "1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a"). InnerVolumeSpecName "kube-api-access-q9tft". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.197927 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a" (UID: "1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.236003 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a" (UID: "1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.243470 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-inventory" (OuterVolumeSpecName: "inventory") pod "1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a" (UID: "1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.295322 4964 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.295367 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9tft\" (UniqueName: \"kubernetes.io/projected/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-kube-api-access-q9tft\") on node \"crc\" DevicePath \"\"" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.295388 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.295406 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.295421 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.624586 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.624592 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg" event={"ID":"1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a","Type":"ContainerDied","Data":"3df50ac07c5a4b3ef211e8c03e647da24faddd8fd2347c0105e0fa3ed39e80de"} Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.625108 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3df50ac07c5a4b3ef211e8c03e647da24faddd8fd2347c0105e0fa3ed39e80de" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.751394 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd"] Oct 04 03:13:37 crc kubenswrapper[4964]: E1004 03:13:37.752006 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.752118 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.752430 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.753305 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.756980 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.757434 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.757842 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.758391 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.758481 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.795222 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd"] Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.905208 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.905312 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dl2d\" (UniqueName: \"kubernetes.io/projected/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-kube-api-access-4dl2d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.905346 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.905529 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:37 crc kubenswrapper[4964]: I1004 03:13:37.905601 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:38 crc kubenswrapper[4964]: I1004 03:13:38.007799 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:38 crc kubenswrapper[4964]: I1004 03:13:38.007912 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:38 crc kubenswrapper[4964]: I1004 03:13:38.008113 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:38 crc kubenswrapper[4964]: I1004 03:13:38.008247 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dl2d\" (UniqueName: \"kubernetes.io/projected/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-kube-api-access-4dl2d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:38 crc kubenswrapper[4964]: I1004 03:13:38.008305 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:38 crc kubenswrapper[4964]: I1004 03:13:38.018898 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:38 crc kubenswrapper[4964]: I1004 03:13:38.022136 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:38 crc kubenswrapper[4964]: I1004 03:13:38.022965 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:38 crc kubenswrapper[4964]: I1004 03:13:38.028532 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:38 crc kubenswrapper[4964]: I1004 03:13:38.058314 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dl2d\" (UniqueName: \"kubernetes.io/projected/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-kube-api-access-4dl2d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:38 crc kubenswrapper[4964]: I1004 03:13:38.086658 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" Oct 04 03:13:38 crc kubenswrapper[4964]: I1004 03:13:38.411094 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd"] Oct 04 03:13:38 crc kubenswrapper[4964]: I1004 03:13:38.633823 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" event={"ID":"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb","Type":"ContainerStarted","Data":"717afb2f10e0ddabf3d3ec55d9af492e173149985bcc629e7d51bbcf7d02de16"} Oct 04 03:13:39 crc kubenswrapper[4964]: I1004 03:13:39.652145 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" event={"ID":"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb","Type":"ContainerStarted","Data":"abe4d22465be3a3f53ce18562479733ff5933c6f4b89ca18e132e3ea106bcf50"} Oct 04 03:13:39 crc kubenswrapper[4964]: I1004 03:13:39.694774 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" podStartSLOduration=1.8404852680000001 podStartE2EDuration="2.694748962s" podCreationTimestamp="2025-10-04 03:13:37 +0000 UTC" firstStartedPulling="2025-10-04 03:13:38.418449242 +0000 UTC m=+1998.315407880" lastFinishedPulling="2025-10-04 03:13:39.272712896 +0000 UTC m=+1999.169671574" observedRunningTime="2025-10-04 03:13:39.686277928 +0000 UTC m=+1999.583236586" watchObservedRunningTime="2025-10-04 03:13:39.694748962 +0000 UTC m=+1999.591707640" Oct 04 03:14:04 crc kubenswrapper[4964]: I1004 03:14:04.449797 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:14:04 crc 
kubenswrapper[4964]: I1004 03:14:04.450580 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:14:04 crc kubenswrapper[4964]: I1004 03:14:04.450660 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 03:14:04 crc kubenswrapper[4964]: I1004 03:14:04.451606 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"808711bbb9b5290bb75b5f959551d10a31c81e759a35e4ab4abfe0ca4eaf4a72"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 03:14:04 crc kubenswrapper[4964]: I1004 03:14:04.451687 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://808711bbb9b5290bb75b5f959551d10a31c81e759a35e4ab4abfe0ca4eaf4a72" gracePeriod=600 Oct 04 03:14:04 crc kubenswrapper[4964]: I1004 03:14:04.912984 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="808711bbb9b5290bb75b5f959551d10a31c81e759a35e4ab4abfe0ca4eaf4a72" exitCode=0 Oct 04 03:14:04 crc kubenswrapper[4964]: I1004 03:14:04.913064 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"808711bbb9b5290bb75b5f959551d10a31c81e759a35e4ab4abfe0ca4eaf4a72"} 
Oct 04 03:14:04 crc kubenswrapper[4964]: I1004 03:14:04.913440 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844"} Oct 04 03:14:04 crc kubenswrapper[4964]: I1004 03:14:04.913467 4964 scope.go:117] "RemoveContainer" containerID="556ecd5a0dbf016384cad6ea30da0c78f417a97a933725f8d5c34e9fe0280e16" Oct 04 03:14:28 crc kubenswrapper[4964]: I1004 03:14:28.565799 4964 scope.go:117] "RemoveContainer" containerID="95ffe0590adef79f10c470bd4bb72231a33f6a70cc4ff1736e4bc3b73a2a25ba" Oct 04 03:14:28 crc kubenswrapper[4964]: I1004 03:14:28.619716 4964 scope.go:117] "RemoveContainer" containerID="0493986c83556f6bbe17cac8fbb444893d74fc3e897e1d4f007ab5caaa9dd7ad" Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.165748 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v"] Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.168042 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v" Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.170339 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.171889 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.182187 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v"] Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.258049 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x8bw\" (UniqueName: \"kubernetes.io/projected/23077962-587a-4a71-9672-222d7348d24f-kube-api-access-4x8bw\") pod \"collect-profiles-29325795-2p95v\" (UID: \"23077962-587a-4a71-9672-222d7348d24f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v" Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.258418 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23077962-587a-4a71-9672-222d7348d24f-config-volume\") pod \"collect-profiles-29325795-2p95v\" (UID: \"23077962-587a-4a71-9672-222d7348d24f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v" Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.258474 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23077962-587a-4a71-9672-222d7348d24f-secret-volume\") pod \"collect-profiles-29325795-2p95v\" (UID: \"23077962-587a-4a71-9672-222d7348d24f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v" Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.361102 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x8bw\" (UniqueName: \"kubernetes.io/projected/23077962-587a-4a71-9672-222d7348d24f-kube-api-access-4x8bw\") pod \"collect-profiles-29325795-2p95v\" (UID: \"23077962-587a-4a71-9672-222d7348d24f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v" Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.361244 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23077962-587a-4a71-9672-222d7348d24f-config-volume\") pod \"collect-profiles-29325795-2p95v\" (UID: \"23077962-587a-4a71-9672-222d7348d24f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v" Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.361352 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23077962-587a-4a71-9672-222d7348d24f-secret-volume\") pod \"collect-profiles-29325795-2p95v\" (UID: \"23077962-587a-4a71-9672-222d7348d24f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v" Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.362341 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23077962-587a-4a71-9672-222d7348d24f-config-volume\") pod \"collect-profiles-29325795-2p95v\" (UID: \"23077962-587a-4a71-9672-222d7348d24f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v" Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.370096 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/23077962-587a-4a71-9672-222d7348d24f-secret-volume\") pod \"collect-profiles-29325795-2p95v\" (UID: \"23077962-587a-4a71-9672-222d7348d24f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v" Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.381172 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x8bw\" (UniqueName: \"kubernetes.io/projected/23077962-587a-4a71-9672-222d7348d24f-kube-api-access-4x8bw\") pod \"collect-profiles-29325795-2p95v\" (UID: \"23077962-587a-4a71-9672-222d7348d24f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v" Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.493631 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v" Oct 04 03:15:00 crc kubenswrapper[4964]: I1004 03:15:00.984792 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v"] Oct 04 03:15:01 crc kubenswrapper[4964]: I1004 03:15:01.515071 4964 generic.go:334] "Generic (PLEG): container finished" podID="23077962-587a-4a71-9672-222d7348d24f" containerID="8af6b7d7aa13c6bfe0d1aaf4da6590fdb82dc9075245c051a7224278065c4a69" exitCode=0 Oct 04 03:15:01 crc kubenswrapper[4964]: I1004 03:15:01.515209 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v" event={"ID":"23077962-587a-4a71-9672-222d7348d24f","Type":"ContainerDied","Data":"8af6b7d7aa13c6bfe0d1aaf4da6590fdb82dc9075245c051a7224278065c4a69"} Oct 04 03:15:01 crc kubenswrapper[4964]: I1004 03:15:01.515428 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v" 
event={"ID":"23077962-587a-4a71-9672-222d7348d24f","Type":"ContainerStarted","Data":"80e85c7af7b79dccaea7354cf520f0a4a8db998bf1b3f38330d73252620f9990"} Oct 04 03:15:02 crc kubenswrapper[4964]: I1004 03:15:02.948779 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.019802 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23077962-587a-4a71-9672-222d7348d24f-secret-volume\") pod \"23077962-587a-4a71-9672-222d7348d24f\" (UID: \"23077962-587a-4a71-9672-222d7348d24f\") " Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.019887 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23077962-587a-4a71-9672-222d7348d24f-config-volume\") pod \"23077962-587a-4a71-9672-222d7348d24f\" (UID: \"23077962-587a-4a71-9672-222d7348d24f\") " Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.020032 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x8bw\" (UniqueName: \"kubernetes.io/projected/23077962-587a-4a71-9672-222d7348d24f-kube-api-access-4x8bw\") pod \"23077962-587a-4a71-9672-222d7348d24f\" (UID: \"23077962-587a-4a71-9672-222d7348d24f\") " Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.022085 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23077962-587a-4a71-9672-222d7348d24f-config-volume" (OuterVolumeSpecName: "config-volume") pod "23077962-587a-4a71-9672-222d7348d24f" (UID: "23077962-587a-4a71-9672-222d7348d24f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.027642 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23077962-587a-4a71-9672-222d7348d24f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "23077962-587a-4a71-9672-222d7348d24f" (UID: "23077962-587a-4a71-9672-222d7348d24f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.032771 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23077962-587a-4a71-9672-222d7348d24f-kube-api-access-4x8bw" (OuterVolumeSpecName: "kube-api-access-4x8bw") pod "23077962-587a-4a71-9672-222d7348d24f" (UID: "23077962-587a-4a71-9672-222d7348d24f"). InnerVolumeSpecName "kube-api-access-4x8bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.094472 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qs2mq"] Oct 04 03:15:03 crc kubenswrapper[4964]: E1004 03:15:03.094914 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23077962-587a-4a71-9672-222d7348d24f" containerName="collect-profiles" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.094928 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="23077962-587a-4a71-9672-222d7348d24f" containerName="collect-profiles" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.095140 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="23077962-587a-4a71-9672-222d7348d24f" containerName="collect-profiles" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.096602 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qs2mq" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.122733 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x8bw\" (UniqueName: \"kubernetes.io/projected/23077962-587a-4a71-9672-222d7348d24f-kube-api-access-4x8bw\") on node \"crc\" DevicePath \"\"" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.122794 4964 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23077962-587a-4a71-9672-222d7348d24f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.122808 4964 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23077962-587a-4a71-9672-222d7348d24f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.125557 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qs2mq"] Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.224130 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwt9k\" (UniqueName: \"kubernetes.io/projected/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-kube-api-access-jwt9k\") pod \"redhat-marketplace-qs2mq\" (UID: \"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2\") " pod="openshift-marketplace/redhat-marketplace-qs2mq" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.224232 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-utilities\") pod \"redhat-marketplace-qs2mq\" (UID: \"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2\") " pod="openshift-marketplace/redhat-marketplace-qs2mq" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.224268 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-catalog-content\") pod \"redhat-marketplace-qs2mq\" (UID: \"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2\") " pod="openshift-marketplace/redhat-marketplace-qs2mq" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.326870 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwt9k\" (UniqueName: \"kubernetes.io/projected/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-kube-api-access-jwt9k\") pod \"redhat-marketplace-qs2mq\" (UID: \"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2\") " pod="openshift-marketplace/redhat-marketplace-qs2mq" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.326972 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-utilities\") pod \"redhat-marketplace-qs2mq\" (UID: \"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2\") " pod="openshift-marketplace/redhat-marketplace-qs2mq" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.327004 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-catalog-content\") pod \"redhat-marketplace-qs2mq\" (UID: \"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2\") " pod="openshift-marketplace/redhat-marketplace-qs2mq" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.327769 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-catalog-content\") pod \"redhat-marketplace-qs2mq\" (UID: \"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2\") " pod="openshift-marketplace/redhat-marketplace-qs2mq" Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.327788 4964 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-utilities\") pod \"redhat-marketplace-qs2mq\" (UID: \"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2\") " pod="openshift-marketplace/redhat-marketplace-qs2mq"
Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.347164 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwt9k\" (UniqueName: \"kubernetes.io/projected/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-kube-api-access-jwt9k\") pod \"redhat-marketplace-qs2mq\" (UID: \"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2\") " pod="openshift-marketplace/redhat-marketplace-qs2mq"
Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.451278 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qs2mq"
Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.549682 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v" event={"ID":"23077962-587a-4a71-9672-222d7348d24f","Type":"ContainerDied","Data":"80e85c7af7b79dccaea7354cf520f0a4a8db998bf1b3f38330d73252620f9990"}
Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.550061 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80e85c7af7b79dccaea7354cf520f0a4a8db998bf1b3f38330d73252620f9990"
Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.550133 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v"
Oct 04 03:15:03 crc kubenswrapper[4964]: I1004 03:15:03.899092 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qs2mq"]
Oct 04 03:15:03 crc kubenswrapper[4964]: W1004 03:15:03.909186 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc074dbc_2e1f_43c8_a612_e69a8c29e1a2.slice/crio-21d7e5aa4c3483e7e3bddf6ec385d9f26688eda6249d9c85c26744cc7f7debc5 WatchSource:0}: Error finding container 21d7e5aa4c3483e7e3bddf6ec385d9f26688eda6249d9c85c26744cc7f7debc5: Status 404 returned error can't find the container with id 21d7e5aa4c3483e7e3bddf6ec385d9f26688eda6249d9c85c26744cc7f7debc5
Oct 04 03:15:04 crc kubenswrapper[4964]: I1004 03:15:04.015182 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc"]
Oct 04 03:15:04 crc kubenswrapper[4964]: I1004 03:15:04.024633 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325750-kkskc"]
Oct 04 03:15:04 crc kubenswrapper[4964]: I1004 03:15:04.560056 4964 generic.go:334] "Generic (PLEG): container finished" podID="cc074dbc-2e1f-43c8-a612-e69a8c29e1a2" containerID="8c5d04bb4f2e087a1f82a9cedc874da3c15215043b4f42f3fedd7fc2581149ba" exitCode=0
Oct 04 03:15:04 crc kubenswrapper[4964]: I1004 03:15:04.560098 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qs2mq" event={"ID":"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2","Type":"ContainerDied","Data":"8c5d04bb4f2e087a1f82a9cedc874da3c15215043b4f42f3fedd7fc2581149ba"}
Oct 04 03:15:04 crc kubenswrapper[4964]: I1004 03:15:04.560121 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qs2mq" event={"ID":"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2","Type":"ContainerStarted","Data":"21d7e5aa4c3483e7e3bddf6ec385d9f26688eda6249d9c85c26744cc7f7debc5"}
Oct 04 03:15:04 crc kubenswrapper[4964]: I1004 03:15:04.860911 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77f14c97-6ee6-4ff8-b98d-3c8cd595b994" path="/var/lib/kubelet/pods/77f14c97-6ee6-4ff8-b98d-3c8cd595b994/volumes"
Oct 04 03:15:06 crc kubenswrapper[4964]: I1004 03:15:06.596274 4964 generic.go:334] "Generic (PLEG): container finished" podID="cc074dbc-2e1f-43c8-a612-e69a8c29e1a2" containerID="cc21b6853a760182e9bcb40ead851ec7f30327e3fa6c56c68e193344899de340" exitCode=0
Oct 04 03:15:06 crc kubenswrapper[4964]: I1004 03:15:06.596734 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qs2mq" event={"ID":"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2","Type":"ContainerDied","Data":"cc21b6853a760182e9bcb40ead851ec7f30327e3fa6c56c68e193344899de340"}
Oct 04 03:15:07 crc kubenswrapper[4964]: I1004 03:15:07.611482 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qs2mq" event={"ID":"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2","Type":"ContainerStarted","Data":"f5ae1498ca3aa100a8375ef2a0c81a4834e1e9c79d8e0531680ede06831e5e1f"}
Oct 04 03:15:07 crc kubenswrapper[4964]: I1004 03:15:07.639096 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qs2mq" podStartSLOduration=2.207570219 podStartE2EDuration="4.639080014s" podCreationTimestamp="2025-10-04 03:15:03 +0000 UTC" firstStartedPulling="2025-10-04 03:15:04.561801945 +0000 UTC m=+2084.458760583" lastFinishedPulling="2025-10-04 03:15:06.99331174 +0000 UTC m=+2086.890270378" observedRunningTime="2025-10-04 03:15:07.633920687 +0000 UTC m=+2087.530879355" watchObservedRunningTime="2025-10-04 03:15:07.639080014 +0000 UTC m=+2087.536038652"
Oct 04 03:15:13 crc kubenswrapper[4964]: I1004 03:15:13.452795 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qs2mq"
Oct 04 03:15:13 crc kubenswrapper[4964]: I1004 03:15:13.453385 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qs2mq"
Oct 04 03:15:13 crc kubenswrapper[4964]: I1004 03:15:13.529011 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qs2mq"
Oct 04 03:15:13 crc kubenswrapper[4964]: I1004 03:15:13.737522 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qs2mq"
Oct 04 03:15:13 crc kubenswrapper[4964]: I1004 03:15:13.790808 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qs2mq"]
Oct 04 03:15:15 crc kubenswrapper[4964]: I1004 03:15:15.702368 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qs2mq" podUID="cc074dbc-2e1f-43c8-a612-e69a8c29e1a2" containerName="registry-server" containerID="cri-o://f5ae1498ca3aa100a8375ef2a0c81a4834e1e9c79d8e0531680ede06831e5e1f" gracePeriod=2
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.313283 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qs2mq"
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.389334 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-utilities\") pod \"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2\" (UID: \"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2\") "
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.389576 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwt9k\" (UniqueName: \"kubernetes.io/projected/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-kube-api-access-jwt9k\") pod \"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2\" (UID: \"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2\") "
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.389790 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-catalog-content\") pod \"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2\" (UID: \"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2\") "
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.391000 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-utilities" (OuterVolumeSpecName: "utilities") pod "cc074dbc-2e1f-43c8-a612-e69a8c29e1a2" (UID: "cc074dbc-2e1f-43c8-a612-e69a8c29e1a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.391551 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-utilities\") on node \"crc\" DevicePath \"\""
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.400592 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-kube-api-access-jwt9k" (OuterVolumeSpecName: "kube-api-access-jwt9k") pod "cc074dbc-2e1f-43c8-a612-e69a8c29e1a2" (UID: "cc074dbc-2e1f-43c8-a612-e69a8c29e1a2"). InnerVolumeSpecName "kube-api-access-jwt9k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.408808 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc074dbc-2e1f-43c8-a612-e69a8c29e1a2" (UID: "cc074dbc-2e1f-43c8-a612-e69a8c29e1a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.493802 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwt9k\" (UniqueName: \"kubernetes.io/projected/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-kube-api-access-jwt9k\") on node \"crc\" DevicePath \"\""
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.494059 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.710810 4964 generic.go:334] "Generic (PLEG): container finished" podID="ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb" containerID="abe4d22465be3a3f53ce18562479733ff5933c6f4b89ca18e132e3ea106bcf50" exitCode=0
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.710875 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" event={"ID":"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb","Type":"ContainerDied","Data":"abe4d22465be3a3f53ce18562479733ff5933c6f4b89ca18e132e3ea106bcf50"}
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.713192 4964 generic.go:334] "Generic (PLEG): container finished" podID="cc074dbc-2e1f-43c8-a612-e69a8c29e1a2" containerID="f5ae1498ca3aa100a8375ef2a0c81a4834e1e9c79d8e0531680ede06831e5e1f" exitCode=0
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.713230 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qs2mq" event={"ID":"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2","Type":"ContainerDied","Data":"f5ae1498ca3aa100a8375ef2a0c81a4834e1e9c79d8e0531680ede06831e5e1f"}
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.713251 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qs2mq"
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.713271 4964 scope.go:117] "RemoveContainer" containerID="f5ae1498ca3aa100a8375ef2a0c81a4834e1e9c79d8e0531680ede06831e5e1f"
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.713257 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qs2mq" event={"ID":"cc074dbc-2e1f-43c8-a612-e69a8c29e1a2","Type":"ContainerDied","Data":"21d7e5aa4c3483e7e3bddf6ec385d9f26688eda6249d9c85c26744cc7f7debc5"}
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.729984 4964 scope.go:117] "RemoveContainer" containerID="cc21b6853a760182e9bcb40ead851ec7f30327e3fa6c56c68e193344899de340"
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.759968 4964 scope.go:117] "RemoveContainer" containerID="8c5d04bb4f2e087a1f82a9cedc874da3c15215043b4f42f3fedd7fc2581149ba"
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.780851 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qs2mq"]
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.787736 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qs2mq"]
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.793083 4964 scope.go:117] "RemoveContainer" containerID="f5ae1498ca3aa100a8375ef2a0c81a4834e1e9c79d8e0531680ede06831e5e1f"
Oct 04 03:15:16 crc kubenswrapper[4964]: E1004 03:15:16.793517 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ae1498ca3aa100a8375ef2a0c81a4834e1e9c79d8e0531680ede06831e5e1f\": container with ID starting with f5ae1498ca3aa100a8375ef2a0c81a4834e1e9c79d8e0531680ede06831e5e1f not found: ID does not exist" containerID="f5ae1498ca3aa100a8375ef2a0c81a4834e1e9c79d8e0531680ede06831e5e1f"
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.793546 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ae1498ca3aa100a8375ef2a0c81a4834e1e9c79d8e0531680ede06831e5e1f"} err="failed to get container status \"f5ae1498ca3aa100a8375ef2a0c81a4834e1e9c79d8e0531680ede06831e5e1f\": rpc error: code = NotFound desc = could not find container \"f5ae1498ca3aa100a8375ef2a0c81a4834e1e9c79d8e0531680ede06831e5e1f\": container with ID starting with f5ae1498ca3aa100a8375ef2a0c81a4834e1e9c79d8e0531680ede06831e5e1f not found: ID does not exist"
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.793566 4964 scope.go:117] "RemoveContainer" containerID="cc21b6853a760182e9bcb40ead851ec7f30327e3fa6c56c68e193344899de340"
Oct 04 03:15:16 crc kubenswrapper[4964]: E1004 03:15:16.793835 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc21b6853a760182e9bcb40ead851ec7f30327e3fa6c56c68e193344899de340\": container with ID starting with cc21b6853a760182e9bcb40ead851ec7f30327e3fa6c56c68e193344899de340 not found: ID does not exist" containerID="cc21b6853a760182e9bcb40ead851ec7f30327e3fa6c56c68e193344899de340"
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.793860 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc21b6853a760182e9bcb40ead851ec7f30327e3fa6c56c68e193344899de340"} err="failed to get container status \"cc21b6853a760182e9bcb40ead851ec7f30327e3fa6c56c68e193344899de340\": rpc error: code = NotFound desc = could not find container \"cc21b6853a760182e9bcb40ead851ec7f30327e3fa6c56c68e193344899de340\": container with ID starting with cc21b6853a760182e9bcb40ead851ec7f30327e3fa6c56c68e193344899de340 not found: ID does not exist"
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.793872 4964 scope.go:117] "RemoveContainer" containerID="8c5d04bb4f2e087a1f82a9cedc874da3c15215043b4f42f3fedd7fc2581149ba"
Oct 04 03:15:16 crc kubenswrapper[4964]: E1004 03:15:16.794217 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5d04bb4f2e087a1f82a9cedc874da3c15215043b4f42f3fedd7fc2581149ba\": container with ID starting with 8c5d04bb4f2e087a1f82a9cedc874da3c15215043b4f42f3fedd7fc2581149ba not found: ID does not exist" containerID="8c5d04bb4f2e087a1f82a9cedc874da3c15215043b4f42f3fedd7fc2581149ba"
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.794235 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5d04bb4f2e087a1f82a9cedc874da3c15215043b4f42f3fedd7fc2581149ba"} err="failed to get container status \"8c5d04bb4f2e087a1f82a9cedc874da3c15215043b4f42f3fedd7fc2581149ba\": rpc error: code = NotFound desc = could not find container \"8c5d04bb4f2e087a1f82a9cedc874da3c15215043b4f42f3fedd7fc2581149ba\": container with ID starting with 8c5d04bb4f2e087a1f82a9cedc874da3c15215043b4f42f3fedd7fc2581149ba not found: ID does not exist"
Oct 04 03:15:16 crc kubenswrapper[4964]: I1004 03:15:16.853729 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc074dbc-2e1f-43c8-a612-e69a8c29e1a2" path="/var/lib/kubelet/pods/cc074dbc-2e1f-43c8-a612-e69a8c29e1a2/volumes"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.257182 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.332889 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-ceph\") pod \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") "
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.333019 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-ssh-key\") pod \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") "
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.333196 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-bootstrap-combined-ca-bundle\") pod \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") "
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.333265 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dl2d\" (UniqueName: \"kubernetes.io/projected/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-kube-api-access-4dl2d\") pod \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") "
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.333413 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-inventory\") pod \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\" (UID: \"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb\") "
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.343846 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-ceph" (OuterVolumeSpecName: "ceph") pod "ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb" (UID: "ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.343867 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-kube-api-access-4dl2d" (OuterVolumeSpecName: "kube-api-access-4dl2d") pod "ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb" (UID: "ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb"). InnerVolumeSpecName "kube-api-access-4dl2d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.354702 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb" (UID: "ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.372722 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb" (UID: "ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.374230 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-inventory" (OuterVolumeSpecName: "inventory") pod "ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb" (UID: "ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.435801 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-inventory\") on node \"crc\" DevicePath \"\""
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.435832 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-ceph\") on node \"crc\" DevicePath \"\""
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.435842 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.435850 4964 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.435860 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dl2d\" (UniqueName: \"kubernetes.io/projected/ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb-kube-api-access-4dl2d\") on node \"crc\" DevicePath \"\""
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.737643 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd" event={"ID":"ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb","Type":"ContainerDied","Data":"717afb2f10e0ddabf3d3ec55d9af492e173149985bcc629e7d51bbcf7d02de16"}
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.738025 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="717afb2f10e0ddabf3d3ec55d9af492e173149985bcc629e7d51bbcf7d02de16"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.737679 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.875581 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"]
Oct 04 03:15:18 crc kubenswrapper[4964]: E1004 03:15:18.876152 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc074dbc-2e1f-43c8-a612-e69a8c29e1a2" containerName="extract-utilities"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.876177 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc074dbc-2e1f-43c8-a612-e69a8c29e1a2" containerName="extract-utilities"
Oct 04 03:15:18 crc kubenswrapper[4964]: E1004 03:15:18.876205 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.876219 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 04 03:15:18 crc kubenswrapper[4964]: E1004 03:15:18.876243 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc074dbc-2e1f-43c8-a612-e69a8c29e1a2" containerName="extract-content"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.876255 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc074dbc-2e1f-43c8-a612-e69a8c29e1a2" containerName="extract-content"
Oct 04 03:15:18 crc kubenswrapper[4964]: E1004 03:15:18.876300 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc074dbc-2e1f-43c8-a612-e69a8c29e1a2" containerName="registry-server"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.876312 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc074dbc-2e1f-43c8-a612-e69a8c29e1a2" containerName="registry-server"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.876633 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.876671 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc074dbc-2e1f-43c8-a612-e69a8c29e1a2" containerName="registry-server"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.877585 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.880395 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.885113 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.885256 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.885507 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.885728 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.887806 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"]
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.943839 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-27h7d\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.944103 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-27h7d\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.944189 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dml8k\" (UniqueName: \"kubernetes.io/projected/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-kube-api-access-dml8k\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-27h7d\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"
Oct 04 03:15:18 crc kubenswrapper[4964]: I1004 03:15:18.944306 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-27h7d\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"
Oct 04 03:15:19 crc kubenswrapper[4964]: I1004 03:15:19.047014 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-27h7d\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"
Oct 04 03:15:19 crc kubenswrapper[4964]: I1004 03:15:19.047088 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dml8k\" (UniqueName: \"kubernetes.io/projected/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-kube-api-access-dml8k\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-27h7d\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"
Oct 04 03:15:19 crc kubenswrapper[4964]: I1004 03:15:19.047145 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-27h7d\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"
Oct 04 03:15:19 crc kubenswrapper[4964]: I1004 03:15:19.047380 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-27h7d\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"
Oct 04 03:15:19 crc kubenswrapper[4964]: I1004 03:15:19.052823 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-27h7d\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"
Oct 04 03:15:19 crc kubenswrapper[4964]: I1004 03:15:19.056360 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-27h7d\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"
Oct 04 03:15:19 crc kubenswrapper[4964]: I1004 03:15:19.058568 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-27h7d\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"
Oct 04 03:15:19 crc kubenswrapper[4964]: I1004 03:15:19.079383 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dml8k\" (UniqueName: \"kubernetes.io/projected/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-kube-api-access-dml8k\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-27h7d\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"
Oct 04 03:15:19 crc kubenswrapper[4964]: I1004 03:15:19.198979 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"
Oct 04 03:15:19 crc kubenswrapper[4964]: I1004 03:15:19.895511 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"]
Oct 04 03:15:19 crc kubenswrapper[4964]: W1004 03:15:19.902200 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40eeca54_e1ed_4769_b7cb_1cd5f7be08f3.slice/crio-850d1c62865a1410ed9d149e1883585540137e5402d1c1064c09b799d38c330f WatchSource:0}: Error finding container 850d1c62865a1410ed9d149e1883585540137e5402d1c1064c09b799d38c330f: Status 404 returned error can't find the container with id 850d1c62865a1410ed9d149e1883585540137e5402d1c1064c09b799d38c330f
Oct 04 03:15:20 crc kubenswrapper[4964]: I1004 03:15:20.760785 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d" event={"ID":"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3","Type":"ContainerStarted","Data":"223956f900a258fc1c8513bc8bb891c5b625063e40f854ad1b2d6f2a2b0c057b"}
Oct 04 03:15:20 crc kubenswrapper[4964]: I1004 03:15:20.761063 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d" event={"ID":"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3","Type":"ContainerStarted","Data":"850d1c62865a1410ed9d149e1883585540137e5402d1c1064c09b799d38c330f"}
Oct 04 03:15:20 crc kubenswrapper[4964]: I1004 03:15:20.793815 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d" podStartSLOduration=2.352352778 podStartE2EDuration="2.793790807s" podCreationTimestamp="2025-10-04 03:15:18 +0000 UTC" firstStartedPulling="2025-10-04 03:15:19.904205409 +0000 UTC m=+2099.801164057" lastFinishedPulling="2025-10-04 03:15:20.345643428 +0000 UTC m=+2100.242602086" observedRunningTime="2025-10-04 03:15:20.784594742 +0000 UTC m=+2100.681553380" watchObservedRunningTime="2025-10-04 03:15:20.793790807 +0000 UTC m=+2100.690749475"
Oct 04 03:15:28 crc kubenswrapper[4964]: I1004 03:15:28.741684 4964 scope.go:117] "RemoveContainer" containerID="6246ea77b76fd9b65defb470a089720c73f6fd8e36721940e32bbef122d4f24c"
Oct 04 03:15:28 crc kubenswrapper[4964]: I1004 03:15:28.788873 4964 scope.go:117] "RemoveContainer" containerID="a06cc299f9052b11b553a1e032f51ab5b8b257d86e053b9d70fe6eec68471cdb"
Oct 04 03:15:48 crc kubenswrapper[4964]: I1004 03:15:48.054479 4964 generic.go:334] "Generic (PLEG): container finished" podID="40eeca54-e1ed-4769-b7cb-1cd5f7be08f3" containerID="223956f900a258fc1c8513bc8bb891c5b625063e40f854ad1b2d6f2a2b0c057b" exitCode=0
Oct 04 03:15:48 crc kubenswrapper[4964]: I1004 03:15:48.054589 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d" event={"ID":"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3","Type":"ContainerDied","Data":"223956f900a258fc1c8513bc8bb891c5b625063e40f854ad1b2d6f2a2b0c057b"}
Oct 04 03:15:49 crc kubenswrapper[4964]: I1004 03:15:49.567759 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d"
Oct 04 03:15:49 crc kubenswrapper[4964]: I1004 03:15:49.703961 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-ceph\") pod \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") "
Oct 04 03:15:49 crc kubenswrapper[4964]: I1004 03:15:49.704354 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-inventory\") pod \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") "
Oct 04 03:15:49 crc kubenswrapper[4964]: I1004 03:15:49.704400 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dml8k\" (UniqueName: \"kubernetes.io/projected/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-kube-api-access-dml8k\") pod \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") "
Oct 04 03:15:49 crc kubenswrapper[4964]: I1004 03:15:49.704471 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-ssh-key\") pod \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\" (UID: \"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3\") "
Oct 04 03:15:49 crc kubenswrapper[4964]: I1004 03:15:49.712254 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-kube-api-access-dml8k" (OuterVolumeSpecName: "kube-api-access-dml8k") pod "40eeca54-e1ed-4769-b7cb-1cd5f7be08f3" (UID: "40eeca54-e1ed-4769-b7cb-1cd5f7be08f3"). InnerVolumeSpecName "kube-api-access-dml8k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 04 03:15:49 crc kubenswrapper[4964]: I1004 03:15:49.713125 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-ceph" (OuterVolumeSpecName: "ceph") pod "40eeca54-e1ed-4769-b7cb-1cd5f7be08f3" (UID: "40eeca54-e1ed-4769-b7cb-1cd5f7be08f3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 03:15:49 crc kubenswrapper[4964]: I1004 03:15:49.757810 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-inventory" (OuterVolumeSpecName: "inventory") pod "40eeca54-e1ed-4769-b7cb-1cd5f7be08f3" (UID: "40eeca54-e1ed-4769-b7cb-1cd5f7be08f3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 03:15:49 crc kubenswrapper[4964]: I1004 03:15:49.763282 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "40eeca54-e1ed-4769-b7cb-1cd5f7be08f3" (UID: "40eeca54-e1ed-4769-b7cb-1cd5f7be08f3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 03:15:49 crc kubenswrapper[4964]: I1004 03:15:49.807005 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-ceph\") on node \"crc\" DevicePath \"\""
Oct 04 03:15:49 crc kubenswrapper[4964]: I1004 03:15:49.807068 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-inventory\") on node \"crc\" DevicePath \"\""
Oct 04 03:15:49 crc kubenswrapper[4964]: I1004 03:15:49.807089 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dml8k\" (UniqueName: \"kubernetes.io/projected/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-kube-api-access-dml8k\") on node \"crc\" DevicePath \"\""
Oct 04 03:15:49 crc kubenswrapper[4964]: I1004 03:15:49.807106 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40eeca54-e1ed-4769-b7cb-1cd5f7be08f3-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.081679 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d" event={"ID":"40eeca54-e1ed-4769-b7cb-1cd5f7be08f3","Type":"ContainerDied","Data":"850d1c62865a1410ed9d149e1883585540137e5402d1c1064c09b799d38c330f"}
Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.081715 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="850d1c62865a1410ed9d149e1883585540137e5402d1c1064c09b799d38c330f"
Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.081764 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-27h7d" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.191778 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct"] Oct 04 03:15:50 crc kubenswrapper[4964]: E1004 03:15:50.192514 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40eeca54-e1ed-4769-b7cb-1cd5f7be08f3" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.192543 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="40eeca54-e1ed-4769-b7cb-1cd5f7be08f3" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.192899 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="40eeca54-e1ed-4769-b7cb-1cd5f7be08f3" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.193892 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.195925 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.198478 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.198845 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.199023 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.201026 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.206172 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct"] Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.316181 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phjd5\" (UniqueName: \"kubernetes.io/projected/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-kube-api-access-phjd5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mwjct\" (UID: \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.316225 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mwjct\" (UID: 
\"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.316257 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mwjct\" (UID: \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.316277 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mwjct\" (UID: \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.418039 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phjd5\" (UniqueName: \"kubernetes.io/projected/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-kube-api-access-phjd5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mwjct\" (UID: \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.418100 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mwjct\" (UID: \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.418139 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mwjct\" (UID: \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.418167 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mwjct\" (UID: \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.423301 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mwjct\" (UID: \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.424056 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mwjct\" (UID: \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.437965 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mwjct\" (UID: \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.448095 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phjd5\" (UniqueName: \"kubernetes.io/projected/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-kube-api-access-phjd5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mwjct\" (UID: \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.512388 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:50 crc kubenswrapper[4964]: I1004 03:15:50.801355 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct"] Oct 04 03:15:50 crc kubenswrapper[4964]: W1004 03:15:50.809049 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7c8421d_a33a_45b1_89f6_f2d57dc16ef6.slice/crio-047abd385a2ca85333ce1fef610beefb10061678730fccf6a57e183553afdd9e WatchSource:0}: Error finding container 047abd385a2ca85333ce1fef610beefb10061678730fccf6a57e183553afdd9e: Status 404 returned error can't find the container with id 047abd385a2ca85333ce1fef610beefb10061678730fccf6a57e183553afdd9e Oct 04 03:15:51 crc kubenswrapper[4964]: I1004 03:15:51.094058 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" event={"ID":"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6","Type":"ContainerStarted","Data":"047abd385a2ca85333ce1fef610beefb10061678730fccf6a57e183553afdd9e"} Oct 04 03:15:52 crc kubenswrapper[4964]: I1004 03:15:52.108664 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" event={"ID":"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6","Type":"ContainerStarted","Data":"42b8d584ab811441c9d28c1fbb2110d487e9a46e2e9c2192a259b3409ff25b4e"} Oct 04 03:15:52 crc kubenswrapper[4964]: I1004 03:15:52.144966 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" podStartSLOduration=1.717112727 podStartE2EDuration="2.144948366s" podCreationTimestamp="2025-10-04 03:15:50 +0000 UTC" firstStartedPulling="2025-10-04 03:15:50.810856005 +0000 UTC m=+2130.707814643" lastFinishedPulling="2025-10-04 03:15:51.238691614 +0000 UTC m=+2131.135650282" observedRunningTime="2025-10-04 03:15:52.138018921 +0000 UTC m=+2132.034977569" watchObservedRunningTime="2025-10-04 03:15:52.144948366 +0000 UTC m=+2132.041907004" Oct 04 03:15:57 crc kubenswrapper[4964]: I1004 03:15:57.160358 4964 generic.go:334] "Generic (PLEG): container finished" podID="d7c8421d-a33a-45b1-89f6-f2d57dc16ef6" containerID="42b8d584ab811441c9d28c1fbb2110d487e9a46e2e9c2192a259b3409ff25b4e" exitCode=0 Oct 04 03:15:57 crc kubenswrapper[4964]: I1004 03:15:57.160638 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" event={"ID":"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6","Type":"ContainerDied","Data":"42b8d584ab811441c9d28c1fbb2110d487e9a46e2e9c2192a259b3409ff25b4e"} Oct 04 03:15:58 crc kubenswrapper[4964]: I1004 03:15:58.742809 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:58 crc kubenswrapper[4964]: I1004 03:15:58.910023 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phjd5\" (UniqueName: \"kubernetes.io/projected/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-kube-api-access-phjd5\") pod \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\" (UID: \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " Oct 04 03:15:58 crc kubenswrapper[4964]: I1004 03:15:58.910122 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-inventory\") pod \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\" (UID: \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " Oct 04 03:15:58 crc kubenswrapper[4964]: I1004 03:15:58.910245 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-ceph\") pod \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\" (UID: \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " Oct 04 03:15:58 crc kubenswrapper[4964]: I1004 03:15:58.910352 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-ssh-key\") pod \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\" (UID: \"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6\") " Oct 04 03:15:58 crc kubenswrapper[4964]: I1004 03:15:58.917968 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-ceph" (OuterVolumeSpecName: "ceph") pod "d7c8421d-a33a-45b1-89f6-f2d57dc16ef6" (UID: "d7c8421d-a33a-45b1-89f6-f2d57dc16ef6"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:15:58 crc kubenswrapper[4964]: I1004 03:15:58.918260 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-kube-api-access-phjd5" (OuterVolumeSpecName: "kube-api-access-phjd5") pod "d7c8421d-a33a-45b1-89f6-f2d57dc16ef6" (UID: "d7c8421d-a33a-45b1-89f6-f2d57dc16ef6"). InnerVolumeSpecName "kube-api-access-phjd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:15:58 crc kubenswrapper[4964]: I1004 03:15:58.961541 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-inventory" (OuterVolumeSpecName: "inventory") pod "d7c8421d-a33a-45b1-89f6-f2d57dc16ef6" (UID: "d7c8421d-a33a-45b1-89f6-f2d57dc16ef6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:15:58 crc kubenswrapper[4964]: I1004 03:15:58.964545 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d7c8421d-a33a-45b1-89f6-f2d57dc16ef6" (UID: "d7c8421d-a33a-45b1-89f6-f2d57dc16ef6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.013596 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.013672 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phjd5\" (UniqueName: \"kubernetes.io/projected/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-kube-api-access-phjd5\") on node \"crc\" DevicePath \"\"" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.013695 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.013713 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7c8421d-a33a-45b1-89f6-f2d57dc16ef6-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.184234 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" event={"ID":"d7c8421d-a33a-45b1-89f6-f2d57dc16ef6","Type":"ContainerDied","Data":"047abd385a2ca85333ce1fef610beefb10061678730fccf6a57e183553afdd9e"} Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.184584 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="047abd385a2ca85333ce1fef610beefb10061678730fccf6a57e183553afdd9e" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.184332 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mwjct" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.273322 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn"] Oct 04 03:15:59 crc kubenswrapper[4964]: E1004 03:15:59.273830 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c8421d-a33a-45b1-89f6-f2d57dc16ef6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.273854 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c8421d-a33a-45b1-89f6-f2d57dc16ef6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.274281 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c8421d-a33a-45b1-89f6-f2d57dc16ef6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.275141 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.277400 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.277987 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.278198 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.279702 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.281120 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.291056 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn"] Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.421871 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nvqkn\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.421978 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nvqkn\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.422121 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nvqkn\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.422180 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpk95\" (UniqueName: \"kubernetes.io/projected/7243e183-f415-4ce5-9b98-6abb97122104-kube-api-access-kpk95\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nvqkn\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.523816 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nvqkn\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.524220 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nvqkn\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.524419 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nvqkn\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.524565 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpk95\" (UniqueName: \"kubernetes.io/projected/7243e183-f415-4ce5-9b98-6abb97122104-kube-api-access-kpk95\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nvqkn\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.531227 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nvqkn\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.533387 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nvqkn\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.533975 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nvqkn\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.546535 
4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpk95\" (UniqueName: \"kubernetes.io/projected/7243e183-f415-4ce5-9b98-6abb97122104-kube-api-access-kpk95\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nvqkn\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:15:59 crc kubenswrapper[4964]: I1004 03:15:59.601235 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:16:00 crc kubenswrapper[4964]: I1004 03:16:00.206141 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn"] Oct 04 03:16:01 crc kubenswrapper[4964]: I1004 03:16:01.205829 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" event={"ID":"7243e183-f415-4ce5-9b98-6abb97122104","Type":"ContainerStarted","Data":"a76ffb37376390ae5c6a270f2e5f6b509b1bb5cb0dd994d54cf14c65d61080cf"} Oct 04 03:16:01 crc kubenswrapper[4964]: I1004 03:16:01.206499 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" event={"ID":"7243e183-f415-4ce5-9b98-6abb97122104","Type":"ContainerStarted","Data":"2c73ea3c619d024e7734a662ad45a3c0ace54f9ce5a17159c9f483b511e276dd"} Oct 04 03:16:01 crc kubenswrapper[4964]: I1004 03:16:01.225773 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" podStartSLOduration=1.681834148 podStartE2EDuration="2.225740584s" podCreationTimestamp="2025-10-04 03:15:59 +0000 UTC" firstStartedPulling="2025-10-04 03:16:00.22049905 +0000 UTC m=+2140.117457728" lastFinishedPulling="2025-10-04 03:16:00.764405486 +0000 UTC m=+2140.661364164" observedRunningTime="2025-10-04 03:16:01.223481114 +0000 UTC 
m=+2141.120439752" watchObservedRunningTime="2025-10-04 03:16:01.225740584 +0000 UTC m=+2141.122699272" Oct 04 03:16:04 crc kubenswrapper[4964]: I1004 03:16:04.448872 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:16:04 crc kubenswrapper[4964]: I1004 03:16:04.449247 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:16:34 crc kubenswrapper[4964]: I1004 03:16:34.449587 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:16:34 crc kubenswrapper[4964]: I1004 03:16:34.450172 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:16:42 crc kubenswrapper[4964]: I1004 03:16:42.636940 4964 generic.go:334] "Generic (PLEG): container finished" podID="7243e183-f415-4ce5-9b98-6abb97122104" containerID="a76ffb37376390ae5c6a270f2e5f6b509b1bb5cb0dd994d54cf14c65d61080cf" exitCode=0 Oct 04 03:16:42 crc kubenswrapper[4964]: I1004 03:16:42.637027 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" event={"ID":"7243e183-f415-4ce5-9b98-6abb97122104","Type":"ContainerDied","Data":"a76ffb37376390ae5c6a270f2e5f6b509b1bb5cb0dd994d54cf14c65d61080cf"} Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.111878 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.212789 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-ceph\") pod \"7243e183-f415-4ce5-9b98-6abb97122104\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.212919 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpk95\" (UniqueName: \"kubernetes.io/projected/7243e183-f415-4ce5-9b98-6abb97122104-kube-api-access-kpk95\") pod \"7243e183-f415-4ce5-9b98-6abb97122104\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.213073 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-ssh-key\") pod \"7243e183-f415-4ce5-9b98-6abb97122104\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.213198 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-inventory\") pod \"7243e183-f415-4ce5-9b98-6abb97122104\" (UID: \"7243e183-f415-4ce5-9b98-6abb97122104\") " Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.222448 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-ceph" (OuterVolumeSpecName: "ceph") pod "7243e183-f415-4ce5-9b98-6abb97122104" (UID: "7243e183-f415-4ce5-9b98-6abb97122104"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.222526 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7243e183-f415-4ce5-9b98-6abb97122104-kube-api-access-kpk95" (OuterVolumeSpecName: "kube-api-access-kpk95") pod "7243e183-f415-4ce5-9b98-6abb97122104" (UID: "7243e183-f415-4ce5-9b98-6abb97122104"). InnerVolumeSpecName "kube-api-access-kpk95". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.251311 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-inventory" (OuterVolumeSpecName: "inventory") pod "7243e183-f415-4ce5-9b98-6abb97122104" (UID: "7243e183-f415-4ce5-9b98-6abb97122104"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.268672 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7243e183-f415-4ce5-9b98-6abb97122104" (UID: "7243e183-f415-4ce5-9b98-6abb97122104"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.316199 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.316479 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.316686 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpk95\" (UniqueName: \"kubernetes.io/projected/7243e183-f415-4ce5-9b98-6abb97122104-kube-api-access-kpk95\") on node \"crc\" DevicePath \"\"" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.316825 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7243e183-f415-4ce5-9b98-6abb97122104-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.665604 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.665754 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nvqkn" event={"ID":"7243e183-f415-4ce5-9b98-6abb97122104","Type":"ContainerDied","Data":"2c73ea3c619d024e7734a662ad45a3c0ace54f9ce5a17159c9f483b511e276dd"} Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.665848 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c73ea3c619d024e7734a662ad45a3c0ace54f9ce5a17159c9f483b511e276dd" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.804656 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm"] Oct 04 03:16:44 crc kubenswrapper[4964]: E1004 03:16:44.805183 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7243e183-f415-4ce5-9b98-6abb97122104" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.805213 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="7243e183-f415-4ce5-9b98-6abb97122104" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.805540 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="7243e183-f415-4ce5-9b98-6abb97122104" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.806509 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.808545 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.813170 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.813918 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.814295 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.814474 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.821143 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm"] Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.930924 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.932186 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.932284 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flhqs\" (UniqueName: \"kubernetes.io/projected/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-kube-api-access-flhqs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:44 crc kubenswrapper[4964]: I1004 03:16:44.932331 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:45 crc kubenswrapper[4964]: I1004 03:16:45.034792 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:45 crc kubenswrapper[4964]: I1004 03:16:45.035231 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:45 crc kubenswrapper[4964]: I1004 03:16:45.035536 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-flhqs\" (UniqueName: \"kubernetes.io/projected/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-kube-api-access-flhqs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:45 crc kubenswrapper[4964]: I1004 03:16:45.035772 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:45 crc kubenswrapper[4964]: I1004 03:16:45.039263 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:45 crc kubenswrapper[4964]: I1004 03:16:45.042559 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:45 crc kubenswrapper[4964]: I1004 03:16:45.042611 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:45 crc 
kubenswrapper[4964]: I1004 03:16:45.066741 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flhqs\" (UniqueName: \"kubernetes.io/projected/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-kube-api-access-flhqs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:45 crc kubenswrapper[4964]: I1004 03:16:45.136382 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:45 crc kubenswrapper[4964]: I1004 03:16:45.543825 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm"] Oct 04 03:16:45 crc kubenswrapper[4964]: W1004 03:16:45.563683 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf8ff1f6_3d01_4248_b4eb_09415f50a4b1.slice/crio-f64c91431e7f4fd38ec419fa68015fd72df864276e0abeeb49c3661557f6fbf5 WatchSource:0}: Error finding container f64c91431e7f4fd38ec419fa68015fd72df864276e0abeeb49c3661557f6fbf5: Status 404 returned error can't find the container with id f64c91431e7f4fd38ec419fa68015fd72df864276e0abeeb49c3661557f6fbf5 Oct 04 03:16:45 crc kubenswrapper[4964]: I1004 03:16:45.676370 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" event={"ID":"af8ff1f6-3d01-4248-b4eb-09415f50a4b1","Type":"ContainerStarted","Data":"f64c91431e7f4fd38ec419fa68015fd72df864276e0abeeb49c3661557f6fbf5"} Oct 04 03:16:46 crc kubenswrapper[4964]: I1004 03:16:46.692057 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" 
event={"ID":"af8ff1f6-3d01-4248-b4eb-09415f50a4b1","Type":"ContainerStarted","Data":"4cd5064e6bc6f7502a50687533a8c0b5ba46f1b3ae76bb322bd447f3bd6cb89b"} Oct 04 03:16:46 crc kubenswrapper[4964]: I1004 03:16:46.725463 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" podStartSLOduration=2.2514367220000002 podStartE2EDuration="2.725434788s" podCreationTimestamp="2025-10-04 03:16:44 +0000 UTC" firstStartedPulling="2025-10-04 03:16:45.567506524 +0000 UTC m=+2185.464465162" lastFinishedPulling="2025-10-04 03:16:46.04150456 +0000 UTC m=+2185.938463228" observedRunningTime="2025-10-04 03:16:46.714539818 +0000 UTC m=+2186.611498486" watchObservedRunningTime="2025-10-04 03:16:46.725434788 +0000 UTC m=+2186.622393466" Oct 04 03:16:50 crc kubenswrapper[4964]: I1004 03:16:50.733738 4964 generic.go:334] "Generic (PLEG): container finished" podID="af8ff1f6-3d01-4248-b4eb-09415f50a4b1" containerID="4cd5064e6bc6f7502a50687533a8c0b5ba46f1b3ae76bb322bd447f3bd6cb89b" exitCode=0 Oct 04 03:16:50 crc kubenswrapper[4964]: I1004 03:16:50.733853 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" event={"ID":"af8ff1f6-3d01-4248-b4eb-09415f50a4b1","Type":"ContainerDied","Data":"4cd5064e6bc6f7502a50687533a8c0b5ba46f1b3ae76bb322bd447f3bd6cb89b"} Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.234969 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.383268 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-ssh-key\") pod \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.383322 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-ceph\") pod \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.383392 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flhqs\" (UniqueName: \"kubernetes.io/projected/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-kube-api-access-flhqs\") pod \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.383517 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-inventory\") pod \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\" (UID: \"af8ff1f6-3d01-4248-b4eb-09415f50a4b1\") " Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.393685 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-ceph" (OuterVolumeSpecName: "ceph") pod "af8ff1f6-3d01-4248-b4eb-09415f50a4b1" (UID: "af8ff1f6-3d01-4248-b4eb-09415f50a4b1"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.393835 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-kube-api-access-flhqs" (OuterVolumeSpecName: "kube-api-access-flhqs") pod "af8ff1f6-3d01-4248-b4eb-09415f50a4b1" (UID: "af8ff1f6-3d01-4248-b4eb-09415f50a4b1"). InnerVolumeSpecName "kube-api-access-flhqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.429779 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "af8ff1f6-3d01-4248-b4eb-09415f50a4b1" (UID: "af8ff1f6-3d01-4248-b4eb-09415f50a4b1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.433695 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-inventory" (OuterVolumeSpecName: "inventory") pod "af8ff1f6-3d01-4248-b4eb-09415f50a4b1" (UID: "af8ff1f6-3d01-4248-b4eb-09415f50a4b1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.486029 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.486093 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.486117 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.486146 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flhqs\" (UniqueName: \"kubernetes.io/projected/af8ff1f6-3d01-4248-b4eb-09415f50a4b1-kube-api-access-flhqs\") on node \"crc\" DevicePath \"\"" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.762827 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" event={"ID":"af8ff1f6-3d01-4248-b4eb-09415f50a4b1","Type":"ContainerDied","Data":"f64c91431e7f4fd38ec419fa68015fd72df864276e0abeeb49c3661557f6fbf5"} Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.762888 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f64c91431e7f4fd38ec419fa68015fd72df864276e0abeeb49c3661557f6fbf5" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.762952 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.869402 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck"] Oct 04 03:16:52 crc kubenswrapper[4964]: E1004 03:16:52.870193 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8ff1f6-3d01-4248-b4eb-09415f50a4b1" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.870334 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8ff1f6-3d01-4248-b4eb-09415f50a4b1" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.870856 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8ff1f6-3d01-4248-b4eb-09415f50a4b1" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.872005 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.875399 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.875460 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.875885 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.876107 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.876374 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.882956 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck"] Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.997162 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jtvb\" (UniqueName: \"kubernetes.io/projected/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-kube-api-access-5jtvb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wnlck\" (UID: \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.997294 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wnlck\" (UID: 
\"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.997420 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wnlck\" (UID: \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 04 03:16:52 crc kubenswrapper[4964]: I1004 03:16:52.998380 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wnlck\" (UID: \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 04 03:16:53 crc kubenswrapper[4964]: I1004 03:16:53.101035 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jtvb\" (UniqueName: \"kubernetes.io/projected/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-kube-api-access-5jtvb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wnlck\" (UID: \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 04 03:16:53 crc kubenswrapper[4964]: I1004 03:16:53.101134 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wnlck\" (UID: \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 04 03:16:53 crc kubenswrapper[4964]: I1004 03:16:53.101217 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wnlck\" (UID: \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 04 03:16:53 crc kubenswrapper[4964]: I1004 03:16:53.101243 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wnlck\" (UID: \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 04 03:16:53 crc kubenswrapper[4964]: I1004 03:16:53.107670 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wnlck\" (UID: \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 04 03:16:53 crc kubenswrapper[4964]: I1004 03:16:53.107716 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wnlck\" (UID: \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 04 03:16:53 crc kubenswrapper[4964]: I1004 03:16:53.114506 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wnlck\" (UID: \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 
04 03:16:53 crc kubenswrapper[4964]: I1004 03:16:53.123259 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jtvb\" (UniqueName: \"kubernetes.io/projected/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-kube-api-access-5jtvb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wnlck\" (UID: \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 04 03:16:53 crc kubenswrapper[4964]: I1004 03:16:53.226355 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 04 03:16:53 crc kubenswrapper[4964]: I1004 03:16:53.836967 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck"] Oct 04 03:16:54 crc kubenswrapper[4964]: I1004 03:16:54.793309 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" event={"ID":"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7","Type":"ContainerStarted","Data":"c3a3313890bf6dbec9a5a7ec9882b73ce91d6c49753734230e90e8e21229e431"} Oct 04 03:16:54 crc kubenswrapper[4964]: I1004 03:16:54.793776 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" event={"ID":"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7","Type":"ContainerStarted","Data":"079cfa4af7f0c30d3a211076b556e84608d871ae2b32bde75a0a593ab71f0490"} Oct 04 03:16:54 crc kubenswrapper[4964]: I1004 03:16:54.832754 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" podStartSLOduration=2.333230313 podStartE2EDuration="2.832729177s" podCreationTimestamp="2025-10-04 03:16:52 +0000 UTC" firstStartedPulling="2025-10-04 03:16:53.848583215 +0000 UTC m=+2193.745541883" lastFinishedPulling="2025-10-04 03:16:54.348082069 +0000 UTC 
m=+2194.245040747" observedRunningTime="2025-10-04 03:16:54.819830135 +0000 UTC m=+2194.716788783" watchObservedRunningTime="2025-10-04 03:16:54.832729177 +0000 UTC m=+2194.729687855" Oct 04 03:17:04 crc kubenswrapper[4964]: I1004 03:17:04.448889 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:17:04 crc kubenswrapper[4964]: I1004 03:17:04.449641 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:17:04 crc kubenswrapper[4964]: I1004 03:17:04.449710 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 03:17:04 crc kubenswrapper[4964]: I1004 03:17:04.450441 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 03:17:04 crc kubenswrapper[4964]: I1004 03:17:04.450539 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" gracePeriod=600 Oct 04 03:17:04 crc 
kubenswrapper[4964]: E1004 03:17:04.571593 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:17:04 crc kubenswrapper[4964]: I1004 03:17:04.897349 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" exitCode=0 Oct 04 03:17:04 crc kubenswrapper[4964]: I1004 03:17:04.897443 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844"} Oct 04 03:17:04 crc kubenswrapper[4964]: I1004 03:17:04.897784 4964 scope.go:117] "RemoveContainer" containerID="808711bbb9b5290bb75b5f959551d10a31c81e759a35e4ab4abfe0ca4eaf4a72" Oct 04 03:17:04 crc kubenswrapper[4964]: I1004 03:17:04.898437 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:17:04 crc kubenswrapper[4964]: E1004 03:17:04.898800 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:17:18 crc kubenswrapper[4964]: I1004 03:17:18.846383 4964 scope.go:117] 
"RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:17:18 crc kubenswrapper[4964]: E1004 03:17:18.847659 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:17:33 crc kubenswrapper[4964]: I1004 03:17:33.845806 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:17:33 crc kubenswrapper[4964]: E1004 03:17:33.846910 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:17:44 crc kubenswrapper[4964]: I1004 03:17:44.335031 4964 generic.go:334] "Generic (PLEG): container finished" podID="31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7" containerID="c3a3313890bf6dbec9a5a7ec9882b73ce91d6c49753734230e90e8e21229e431" exitCode=0 Oct 04 03:17:44 crc kubenswrapper[4964]: I1004 03:17:44.335160 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" event={"ID":"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7","Type":"ContainerDied","Data":"c3a3313890bf6dbec9a5a7ec9882b73ce91d6c49753734230e90e8e21229e431"} Oct 04 03:17:45 crc kubenswrapper[4964]: I1004 03:17:45.909567 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 04 03:17:45 crc kubenswrapper[4964]: I1004 03:17:45.989073 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-inventory\") pod \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\" (UID: \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " Oct 04 03:17:45 crc kubenswrapper[4964]: I1004 03:17:45.989171 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-ceph\") pod \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\" (UID: \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " Oct 04 03:17:45 crc kubenswrapper[4964]: I1004 03:17:45.989286 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-ssh-key\") pod \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\" (UID: \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " Oct 04 03:17:45 crc kubenswrapper[4964]: I1004 03:17:45.989418 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jtvb\" (UniqueName: \"kubernetes.io/projected/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-kube-api-access-5jtvb\") pod \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\" (UID: \"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7\") " Oct 04 03:17:45 crc kubenswrapper[4964]: I1004 03:17:45.998702 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-ceph" (OuterVolumeSpecName: "ceph") pod "31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7" (UID: "31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:17:45 crc kubenswrapper[4964]: I1004 03:17:45.998836 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-kube-api-access-5jtvb" (OuterVolumeSpecName: "kube-api-access-5jtvb") pod "31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7" (UID: "31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7"). InnerVolumeSpecName "kube-api-access-5jtvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.036064 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-inventory" (OuterVolumeSpecName: "inventory") pod "31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7" (UID: "31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.041507 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7" (UID: "31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.094100 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.094175 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jtvb\" (UniqueName: \"kubernetes.io/projected/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-kube-api-access-5jtvb\") on node \"crc\" DevicePath \"\"" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.094198 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.094217 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.355139 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" event={"ID":"31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7","Type":"ContainerDied","Data":"079cfa4af7f0c30d3a211076b556e84608d871ae2b32bde75a0a593ab71f0490"} Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.355181 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="079cfa4af7f0c30d3a211076b556e84608d871ae2b32bde75a0a593ab71f0490" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.355243 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wnlck" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.448772 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8xppt"] Oct 04 03:17:46 crc kubenswrapper[4964]: E1004 03:17:46.449226 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.449252 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.449486 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.450199 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.453558 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.453700 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.453787 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.453834 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.454376 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.476348 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8xppt"] Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.501570 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8xppt\" (UID: \"591298c0-563f-43b7-9291-9463885f7c6c\") " pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.501740 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8xppt\" (UID: \"591298c0-563f-43b7-9291-9463885f7c6c\") " pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" 
Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.501793 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqjhv\" (UniqueName: \"kubernetes.io/projected/591298c0-563f-43b7-9291-9463885f7c6c-kube-api-access-hqjhv\") pod \"ssh-known-hosts-edpm-deployment-8xppt\" (UID: \"591298c0-563f-43b7-9291-9463885f7c6c\") " pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.501925 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-ceph\") pod \"ssh-known-hosts-edpm-deployment-8xppt\" (UID: \"591298c0-563f-43b7-9291-9463885f7c6c\") " pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.604076 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-ceph\") pod \"ssh-known-hosts-edpm-deployment-8xppt\" (UID: \"591298c0-563f-43b7-9291-9463885f7c6c\") " pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.604182 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8xppt\" (UID: \"591298c0-563f-43b7-9291-9463885f7c6c\") " pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.604233 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8xppt\" (UID: \"591298c0-563f-43b7-9291-9463885f7c6c\") 
" pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.604273 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqjhv\" (UniqueName: \"kubernetes.io/projected/591298c0-563f-43b7-9291-9463885f7c6c-kube-api-access-hqjhv\") pod \"ssh-known-hosts-edpm-deployment-8xppt\" (UID: \"591298c0-563f-43b7-9291-9463885f7c6c\") " pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.608635 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8xppt\" (UID: \"591298c0-563f-43b7-9291-9463885f7c6c\") " pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.608767 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-ceph\") pod \"ssh-known-hosts-edpm-deployment-8xppt\" (UID: \"591298c0-563f-43b7-9291-9463885f7c6c\") " pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.609053 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8xppt\" (UID: \"591298c0-563f-43b7-9291-9463885f7c6c\") " pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.626462 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqjhv\" (UniqueName: \"kubernetes.io/projected/591298c0-563f-43b7-9291-9463885f7c6c-kube-api-access-hqjhv\") pod \"ssh-known-hosts-edpm-deployment-8xppt\" (UID: 
\"591298c0-563f-43b7-9291-9463885f7c6c\") " pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.792164 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" Oct 04 03:17:46 crc kubenswrapper[4964]: I1004 03:17:46.846559 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:17:46 crc kubenswrapper[4964]: E1004 03:17:46.847155 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:17:47 crc kubenswrapper[4964]: I1004 03:17:47.443757 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8xppt"] Oct 04 03:17:47 crc kubenswrapper[4964]: W1004 03:17:47.446413 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod591298c0_563f_43b7_9291_9463885f7c6c.slice/crio-e830319c86c3f2d46ec2274a4a4a784c516ad45c9a22a72b338fbde12ae7a4f4 WatchSource:0}: Error finding container e830319c86c3f2d46ec2274a4a4a784c516ad45c9a22a72b338fbde12ae7a4f4: Status 404 returned error can't find the container with id e830319c86c3f2d46ec2274a4a4a784c516ad45c9a22a72b338fbde12ae7a4f4 Oct 04 03:17:48 crc kubenswrapper[4964]: I1004 03:17:48.387414 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" 
event={"ID":"591298c0-563f-43b7-9291-9463885f7c6c","Type":"ContainerStarted","Data":"9ecff0c8f09371ab87967834c9c3d41a43eb9411216714fafba5569ae010b10d"} Oct 04 03:17:48 crc kubenswrapper[4964]: I1004 03:17:48.387827 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" event={"ID":"591298c0-563f-43b7-9291-9463885f7c6c","Type":"ContainerStarted","Data":"e830319c86c3f2d46ec2274a4a4a784c516ad45c9a22a72b338fbde12ae7a4f4"} Oct 04 03:17:48 crc kubenswrapper[4964]: I1004 03:17:48.410133 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" podStartSLOduration=1.9588729919999999 podStartE2EDuration="2.410115003s" podCreationTimestamp="2025-10-04 03:17:46 +0000 UTC" firstStartedPulling="2025-10-04 03:17:47.448230552 +0000 UTC m=+2247.345189190" lastFinishedPulling="2025-10-04 03:17:47.899472523 +0000 UTC m=+2247.796431201" observedRunningTime="2025-10-04 03:17:48.405003278 +0000 UTC m=+2248.301961916" watchObservedRunningTime="2025-10-04 03:17:48.410115003 +0000 UTC m=+2248.307073641" Oct 04 03:17:58 crc kubenswrapper[4964]: I1004 03:17:58.498584 4964 generic.go:334] "Generic (PLEG): container finished" podID="591298c0-563f-43b7-9291-9463885f7c6c" containerID="9ecff0c8f09371ab87967834c9c3d41a43eb9411216714fafba5569ae010b10d" exitCode=0 Oct 04 03:17:58 crc kubenswrapper[4964]: I1004 03:17:58.498680 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" event={"ID":"591298c0-563f-43b7-9291-9463885f7c6c","Type":"ContainerDied","Data":"9ecff0c8f09371ab87967834c9c3d41a43eb9411216714fafba5569ae010b10d"} Oct 04 03:17:59 crc kubenswrapper[4964]: I1004 03:17:59.845959 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:17:59 crc kubenswrapper[4964]: E1004 03:17:59.846541 4964 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:17:59 crc kubenswrapper[4964]: I1004 03:17:59.967755 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.076286 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-inventory-0\") pod \"591298c0-563f-43b7-9291-9463885f7c6c\" (UID: \"591298c0-563f-43b7-9291-9463885f7c6c\") " Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.076356 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-ssh-key-openstack-edpm-ipam\") pod \"591298c0-563f-43b7-9291-9463885f7c6c\" (UID: \"591298c0-563f-43b7-9291-9463885f7c6c\") " Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.076400 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-ceph\") pod \"591298c0-563f-43b7-9291-9463885f7c6c\" (UID: \"591298c0-563f-43b7-9291-9463885f7c6c\") " Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.076455 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqjhv\" (UniqueName: \"kubernetes.io/projected/591298c0-563f-43b7-9291-9463885f7c6c-kube-api-access-hqjhv\") pod \"591298c0-563f-43b7-9291-9463885f7c6c\" (UID: \"591298c0-563f-43b7-9291-9463885f7c6c\") " Oct 
04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.082179 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-ceph" (OuterVolumeSpecName: "ceph") pod "591298c0-563f-43b7-9291-9463885f7c6c" (UID: "591298c0-563f-43b7-9291-9463885f7c6c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.082303 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/591298c0-563f-43b7-9291-9463885f7c6c-kube-api-access-hqjhv" (OuterVolumeSpecName: "kube-api-access-hqjhv") pod "591298c0-563f-43b7-9291-9463885f7c6c" (UID: "591298c0-563f-43b7-9291-9463885f7c6c"). InnerVolumeSpecName "kube-api-access-hqjhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.105895 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "591298c0-563f-43b7-9291-9463885f7c6c" (UID: "591298c0-563f-43b7-9291-9463885f7c6c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.124319 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "591298c0-563f-43b7-9291-9463885f7c6c" (UID: "591298c0-563f-43b7-9291-9463885f7c6c"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.177905 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.177955 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.177976 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqjhv\" (UniqueName: \"kubernetes.io/projected/591298c0-563f-43b7-9291-9463885f7c6c-kube-api-access-hqjhv\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.177996 4964 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/591298c0-563f-43b7-9291-9463885f7c6c-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.525968 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" event={"ID":"591298c0-563f-43b7-9291-9463885f7c6c","Type":"ContainerDied","Data":"e830319c86c3f2d46ec2274a4a4a784c516ad45c9a22a72b338fbde12ae7a4f4"} Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.526030 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e830319c86c3f2d46ec2274a4a4a784c516ad45c9a22a72b338fbde12ae7a4f4" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.526036 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8xppt" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.614021 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml"] Oct 04 03:18:00 crc kubenswrapper[4964]: E1004 03:18:00.614429 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591298c0-563f-43b7-9291-9463885f7c6c" containerName="ssh-known-hosts-edpm-deployment" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.614449 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="591298c0-563f-43b7-9291-9463885f7c6c" containerName="ssh-known-hosts-edpm-deployment" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.614688 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="591298c0-563f-43b7-9291-9463885f7c6c" containerName="ssh-known-hosts-edpm-deployment" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.615339 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.618268 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.618286 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.618780 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.619297 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.627970 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.633179 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml"] Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.687743 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjzml\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.688038 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjzml\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.688270 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfmbx\" (UniqueName: \"kubernetes.io/projected/13993097-e1a1-4f3a-8e38-eee10934a97d-kube-api-access-xfmbx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjzml\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.688472 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjzml\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.792389 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjzml\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.792479 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjzml\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.792677 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfmbx\" (UniqueName: 
\"kubernetes.io/projected/13993097-e1a1-4f3a-8e38-eee10934a97d-kube-api-access-xfmbx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjzml\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.792853 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjzml\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.797277 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjzml\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.798477 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjzml\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.799323 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjzml\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.810036 4964 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xfmbx\" (UniqueName: \"kubernetes.io/projected/13993097-e1a1-4f3a-8e38-eee10934a97d-kube-api-access-xfmbx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjzml\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:00 crc kubenswrapper[4964]: I1004 03:18:00.932109 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:01 crc kubenswrapper[4964]: I1004 03:18:01.571232 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml"] Oct 04 03:18:01 crc kubenswrapper[4964]: W1004 03:18:01.583392 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13993097_e1a1_4f3a_8e38_eee10934a97d.slice/crio-1764965166ecb252cc3fb62b6b93a294cfb9f07948c99b0788ca3c8deb38b919 WatchSource:0}: Error finding container 1764965166ecb252cc3fb62b6b93a294cfb9f07948c99b0788ca3c8deb38b919: Status 404 returned error can't find the container with id 1764965166ecb252cc3fb62b6b93a294cfb9f07948c99b0788ca3c8deb38b919 Oct 04 03:18:02 crc kubenswrapper[4964]: I1004 03:18:02.545063 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" event={"ID":"13993097-e1a1-4f3a-8e38-eee10934a97d","Type":"ContainerStarted","Data":"1764965166ecb252cc3fb62b6b93a294cfb9f07948c99b0788ca3c8deb38b919"} Oct 04 03:18:03 crc kubenswrapper[4964]: I1004 03:18:03.559322 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" event={"ID":"13993097-e1a1-4f3a-8e38-eee10934a97d","Type":"ContainerStarted","Data":"cabdc845f3682164cba6c81bc19e9d622c7416aad3a589a044a92a6244ea712a"} Oct 04 03:18:03 crc kubenswrapper[4964]: I1004 
03:18:03.583219 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" podStartSLOduration=2.909654191 podStartE2EDuration="3.583182664s" podCreationTimestamp="2025-10-04 03:18:00 +0000 UTC" firstStartedPulling="2025-10-04 03:18:01.590009196 +0000 UTC m=+2261.486967874" lastFinishedPulling="2025-10-04 03:18:02.263537699 +0000 UTC m=+2262.160496347" observedRunningTime="2025-10-04 03:18:03.578811137 +0000 UTC m=+2263.475769815" watchObservedRunningTime="2025-10-04 03:18:03.583182664 +0000 UTC m=+2263.480141342" Oct 04 03:18:10 crc kubenswrapper[4964]: I1004 03:18:10.660023 4964 generic.go:334] "Generic (PLEG): container finished" podID="13993097-e1a1-4f3a-8e38-eee10934a97d" containerID="cabdc845f3682164cba6c81bc19e9d622c7416aad3a589a044a92a6244ea712a" exitCode=0 Oct 04 03:18:10 crc kubenswrapper[4964]: I1004 03:18:10.660589 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" event={"ID":"13993097-e1a1-4f3a-8e38-eee10934a97d","Type":"ContainerDied","Data":"cabdc845f3682164cba6c81bc19e9d622c7416aad3a589a044a92a6244ea712a"} Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.148264 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.320762 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-ssh-key\") pod \"13993097-e1a1-4f3a-8e38-eee10934a97d\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.320859 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-inventory\") pod \"13993097-e1a1-4f3a-8e38-eee10934a97d\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.320925 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-ceph\") pod \"13993097-e1a1-4f3a-8e38-eee10934a97d\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.321019 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfmbx\" (UniqueName: \"kubernetes.io/projected/13993097-e1a1-4f3a-8e38-eee10934a97d-kube-api-access-xfmbx\") pod \"13993097-e1a1-4f3a-8e38-eee10934a97d\" (UID: \"13993097-e1a1-4f3a-8e38-eee10934a97d\") " Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.326751 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-ceph" (OuterVolumeSpecName: "ceph") pod "13993097-e1a1-4f3a-8e38-eee10934a97d" (UID: "13993097-e1a1-4f3a-8e38-eee10934a97d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.327120 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13993097-e1a1-4f3a-8e38-eee10934a97d-kube-api-access-xfmbx" (OuterVolumeSpecName: "kube-api-access-xfmbx") pod "13993097-e1a1-4f3a-8e38-eee10934a97d" (UID: "13993097-e1a1-4f3a-8e38-eee10934a97d"). InnerVolumeSpecName "kube-api-access-xfmbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.367283 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-inventory" (OuterVolumeSpecName: "inventory") pod "13993097-e1a1-4f3a-8e38-eee10934a97d" (UID: "13993097-e1a1-4f3a-8e38-eee10934a97d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.368833 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "13993097-e1a1-4f3a-8e38-eee10934a97d" (UID: "13993097-e1a1-4f3a-8e38-eee10934a97d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.423745 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.423785 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.423799 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13993097-e1a1-4f3a-8e38-eee10934a97d-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.423813 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfmbx\" (UniqueName: \"kubernetes.io/projected/13993097-e1a1-4f3a-8e38-eee10934a97d-kube-api-access-xfmbx\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.686511 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" event={"ID":"13993097-e1a1-4f3a-8e38-eee10934a97d","Type":"ContainerDied","Data":"1764965166ecb252cc3fb62b6b93a294cfb9f07948c99b0788ca3c8deb38b919"} Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.686582 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1764965166ecb252cc3fb62b6b93a294cfb9f07948c99b0788ca3c8deb38b919" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.686605 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjzml" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.792594 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj"] Oct 04 03:18:12 crc kubenswrapper[4964]: E1004 03:18:12.793307 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13993097-e1a1-4f3a-8e38-eee10934a97d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.793403 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="13993097-e1a1-4f3a-8e38-eee10934a97d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.793708 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="13993097-e1a1-4f3a-8e38-eee10934a97d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.794472 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.796935 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.797231 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.797475 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.797841 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.798082 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.826331 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj"] Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.835242 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.835608 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46r6v\" (UniqueName: \"kubernetes.io/projected/aea639a4-f63d-46d5-abc4-d574ac966161-kube-api-access-46r6v\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.835746 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.836124 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.938807 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.939011 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46r6v\" (UniqueName: \"kubernetes.io/projected/aea639a4-f63d-46d5-abc4-d574ac966161-kube-api-access-46r6v\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.939063 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.939191 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.945094 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.946581 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.946601 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:12 crc kubenswrapper[4964]: I1004 03:18:12.960860 4964 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-46r6v\" (UniqueName: \"kubernetes.io/projected/aea639a4-f63d-46d5-abc4-d574ac966161-kube-api-access-46r6v\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:13 crc kubenswrapper[4964]: I1004 03:18:13.140194 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:13 crc kubenswrapper[4964]: I1004 03:18:13.741801 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj"] Oct 04 03:18:13 crc kubenswrapper[4964]: W1004 03:18:13.747450 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaea639a4_f63d_46d5_abc4_d574ac966161.slice/crio-b9d8254e29d3fecd312784ef001aca55ca9cf6d5c6bf6e334a9d4dc5066d887f WatchSource:0}: Error finding container b9d8254e29d3fecd312784ef001aca55ca9cf6d5c6bf6e334a9d4dc5066d887f: Status 404 returned error can't find the container with id b9d8254e29d3fecd312784ef001aca55ca9cf6d5c6bf6e334a9d4dc5066d887f Oct 04 03:18:14 crc kubenswrapper[4964]: I1004 03:18:14.710279 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" event={"ID":"aea639a4-f63d-46d5-abc4-d574ac966161","Type":"ContainerStarted","Data":"5628a5baa3c3582a985f1868eca33b1964ca9303d387ff96027d29a613adcb4b"} Oct 04 03:18:14 crc kubenswrapper[4964]: I1004 03:18:14.710612 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" event={"ID":"aea639a4-f63d-46d5-abc4-d574ac966161","Type":"ContainerStarted","Data":"b9d8254e29d3fecd312784ef001aca55ca9cf6d5c6bf6e334a9d4dc5066d887f"} Oct 04 03:18:14 crc kubenswrapper[4964]: 
I1004 03:18:14.758749 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" podStartSLOduration=2.198429163 podStartE2EDuration="2.758724993s" podCreationTimestamp="2025-10-04 03:18:12 +0000 UTC" firstStartedPulling="2025-10-04 03:18:13.750035893 +0000 UTC m=+2273.646994541" lastFinishedPulling="2025-10-04 03:18:14.310331733 +0000 UTC m=+2274.207290371" observedRunningTime="2025-10-04 03:18:14.730832663 +0000 UTC m=+2274.627791331" watchObservedRunningTime="2025-10-04 03:18:14.758724993 +0000 UTC m=+2274.655683641" Oct 04 03:18:14 crc kubenswrapper[4964]: I1004 03:18:14.845815 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:18:14 crc kubenswrapper[4964]: E1004 03:18:14.846187 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:18:15 crc kubenswrapper[4964]: I1004 03:18:15.724432 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pxkwr"] Oct 04 03:18:15 crc kubenswrapper[4964]: I1004 03:18:15.728320 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:15 crc kubenswrapper[4964]: I1004 03:18:15.746729 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pxkwr"] Oct 04 03:18:15 crc kubenswrapper[4964]: I1004 03:18:15.902052 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c31ba20-4f86-409e-815f-77ff351c1e39-utilities\") pod \"redhat-operators-pxkwr\" (UID: \"8c31ba20-4f86-409e-815f-77ff351c1e39\") " pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:15 crc kubenswrapper[4964]: I1004 03:18:15.902158 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c31ba20-4f86-409e-815f-77ff351c1e39-catalog-content\") pod \"redhat-operators-pxkwr\" (UID: \"8c31ba20-4f86-409e-815f-77ff351c1e39\") " pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:15 crc kubenswrapper[4964]: I1004 03:18:15.902198 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpv6q\" (UniqueName: \"kubernetes.io/projected/8c31ba20-4f86-409e-815f-77ff351c1e39-kube-api-access-vpv6q\") pod \"redhat-operators-pxkwr\" (UID: \"8c31ba20-4f86-409e-815f-77ff351c1e39\") " pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:16 crc kubenswrapper[4964]: I1004 03:18:16.003784 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c31ba20-4f86-409e-815f-77ff351c1e39-catalog-content\") pod \"redhat-operators-pxkwr\" (UID: \"8c31ba20-4f86-409e-815f-77ff351c1e39\") " pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:16 crc kubenswrapper[4964]: I1004 03:18:16.003843 4964 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-vpv6q\" (UniqueName: \"kubernetes.io/projected/8c31ba20-4f86-409e-815f-77ff351c1e39-kube-api-access-vpv6q\") pod \"redhat-operators-pxkwr\" (UID: \"8c31ba20-4f86-409e-815f-77ff351c1e39\") " pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:16 crc kubenswrapper[4964]: I1004 03:18:16.004016 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c31ba20-4f86-409e-815f-77ff351c1e39-utilities\") pod \"redhat-operators-pxkwr\" (UID: \"8c31ba20-4f86-409e-815f-77ff351c1e39\") " pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:16 crc kubenswrapper[4964]: I1004 03:18:16.004715 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c31ba20-4f86-409e-815f-77ff351c1e39-utilities\") pod \"redhat-operators-pxkwr\" (UID: \"8c31ba20-4f86-409e-815f-77ff351c1e39\") " pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:16 crc kubenswrapper[4964]: I1004 03:18:16.004768 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c31ba20-4f86-409e-815f-77ff351c1e39-catalog-content\") pod \"redhat-operators-pxkwr\" (UID: \"8c31ba20-4f86-409e-815f-77ff351c1e39\") " pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:16 crc kubenswrapper[4964]: I1004 03:18:16.022991 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpv6q\" (UniqueName: \"kubernetes.io/projected/8c31ba20-4f86-409e-815f-77ff351c1e39-kube-api-access-vpv6q\") pod \"redhat-operators-pxkwr\" (UID: \"8c31ba20-4f86-409e-815f-77ff351c1e39\") " pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:16 crc kubenswrapper[4964]: I1004 03:18:16.066221 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:16 crc kubenswrapper[4964]: I1004 03:18:16.512460 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pxkwr"] Oct 04 03:18:16 crc kubenswrapper[4964]: I1004 03:18:16.732265 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxkwr" event={"ID":"8c31ba20-4f86-409e-815f-77ff351c1e39","Type":"ContainerStarted","Data":"d5e1db92028a9073629869cf7ffcb7afcd786c526fd5e2519e3acda9b228501f"} Oct 04 03:18:16 crc kubenswrapper[4964]: I1004 03:18:16.732565 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxkwr" event={"ID":"8c31ba20-4f86-409e-815f-77ff351c1e39","Type":"ContainerStarted","Data":"0dcc1523984035b7175a7aa29f70bd23745bd99b5b0bad7ce738f5b1b397bc65"} Oct 04 03:18:17 crc kubenswrapper[4964]: I1004 03:18:17.525112 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6pg5r"] Oct 04 03:18:17 crc kubenswrapper[4964]: I1004 03:18:17.528089 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:17 crc kubenswrapper[4964]: I1004 03:18:17.546414 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6pg5r"] Oct 04 03:18:17 crc kubenswrapper[4964]: I1004 03:18:17.631551 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11d48a73-1170-482c-9c31-8e9b0e5d4096-utilities\") pod \"community-operators-6pg5r\" (UID: \"11d48a73-1170-482c-9c31-8e9b0e5d4096\") " pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:17 crc kubenswrapper[4964]: I1004 03:18:17.632010 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11d48a73-1170-482c-9c31-8e9b0e5d4096-catalog-content\") pod \"community-operators-6pg5r\" (UID: \"11d48a73-1170-482c-9c31-8e9b0e5d4096\") " pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:17 crc kubenswrapper[4964]: I1004 03:18:17.632072 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzb8r\" (UniqueName: \"kubernetes.io/projected/11d48a73-1170-482c-9c31-8e9b0e5d4096-kube-api-access-fzb8r\") pod \"community-operators-6pg5r\" (UID: \"11d48a73-1170-482c-9c31-8e9b0e5d4096\") " pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:17 crc kubenswrapper[4964]: I1004 03:18:17.733955 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11d48a73-1170-482c-9c31-8e9b0e5d4096-catalog-content\") pod \"community-operators-6pg5r\" (UID: \"11d48a73-1170-482c-9c31-8e9b0e5d4096\") " pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:17 crc kubenswrapper[4964]: I1004 03:18:17.734020 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fzb8r\" (UniqueName: \"kubernetes.io/projected/11d48a73-1170-482c-9c31-8e9b0e5d4096-kube-api-access-fzb8r\") pod \"community-operators-6pg5r\" (UID: \"11d48a73-1170-482c-9c31-8e9b0e5d4096\") " pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:17 crc kubenswrapper[4964]: I1004 03:18:17.734072 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11d48a73-1170-482c-9c31-8e9b0e5d4096-utilities\") pod \"community-operators-6pg5r\" (UID: \"11d48a73-1170-482c-9c31-8e9b0e5d4096\") " pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:17 crc kubenswrapper[4964]: I1004 03:18:17.734798 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11d48a73-1170-482c-9c31-8e9b0e5d4096-catalog-content\") pod \"community-operators-6pg5r\" (UID: \"11d48a73-1170-482c-9c31-8e9b0e5d4096\") " pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:17 crc kubenswrapper[4964]: I1004 03:18:17.734866 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11d48a73-1170-482c-9c31-8e9b0e5d4096-utilities\") pod \"community-operators-6pg5r\" (UID: \"11d48a73-1170-482c-9c31-8e9b0e5d4096\") " pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:17 crc kubenswrapper[4964]: I1004 03:18:17.742310 4964 generic.go:334] "Generic (PLEG): container finished" podID="8c31ba20-4f86-409e-815f-77ff351c1e39" containerID="d5e1db92028a9073629869cf7ffcb7afcd786c526fd5e2519e3acda9b228501f" exitCode=0 Oct 04 03:18:17 crc kubenswrapper[4964]: I1004 03:18:17.742349 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxkwr" 
event={"ID":"8c31ba20-4f86-409e-815f-77ff351c1e39","Type":"ContainerDied","Data":"d5e1db92028a9073629869cf7ffcb7afcd786c526fd5e2519e3acda9b228501f"} Oct 04 03:18:17 crc kubenswrapper[4964]: I1004 03:18:17.754498 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzb8r\" (UniqueName: \"kubernetes.io/projected/11d48a73-1170-482c-9c31-8e9b0e5d4096-kube-api-access-fzb8r\") pod \"community-operators-6pg5r\" (UID: \"11d48a73-1170-482c-9c31-8e9b0e5d4096\") " pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:17 crc kubenswrapper[4964]: I1004 03:18:17.856495 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:18 crc kubenswrapper[4964]: I1004 03:18:18.403124 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6pg5r"] Oct 04 03:18:18 crc kubenswrapper[4964]: W1004 03:18:18.407421 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11d48a73_1170_482c_9c31_8e9b0e5d4096.slice/crio-ff9aa59ec88452e2e2271c15a4d046251f873d45f6aff3176241b770476b390f WatchSource:0}: Error finding container ff9aa59ec88452e2e2271c15a4d046251f873d45f6aff3176241b770476b390f: Status 404 returned error can't find the container with id ff9aa59ec88452e2e2271c15a4d046251f873d45f6aff3176241b770476b390f Oct 04 03:18:18 crc kubenswrapper[4964]: I1004 03:18:18.754659 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxkwr" event={"ID":"8c31ba20-4f86-409e-815f-77ff351c1e39","Type":"ContainerStarted","Data":"6450cd990d1a4ba34274ce8a86e357fccd8dad43695c8a02d1d2467ba1039edc"} Oct 04 03:18:18 crc kubenswrapper[4964]: I1004 03:18:18.758862 4964 generic.go:334] "Generic (PLEG): container finished" podID="11d48a73-1170-482c-9c31-8e9b0e5d4096" 
containerID="fbdf5f4f82fae0a3027ff797ff654759ed4e9cc9c726215f5ca0ce43a1063922" exitCode=0 Oct 04 03:18:18 crc kubenswrapper[4964]: I1004 03:18:18.758928 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6pg5r" event={"ID":"11d48a73-1170-482c-9c31-8e9b0e5d4096","Type":"ContainerDied","Data":"fbdf5f4f82fae0a3027ff797ff654759ed4e9cc9c726215f5ca0ce43a1063922"} Oct 04 03:18:18 crc kubenswrapper[4964]: I1004 03:18:18.758969 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6pg5r" event={"ID":"11d48a73-1170-482c-9c31-8e9b0e5d4096","Type":"ContainerStarted","Data":"ff9aa59ec88452e2e2271c15a4d046251f873d45f6aff3176241b770476b390f"} Oct 04 03:18:20 crc kubenswrapper[4964]: I1004 03:18:20.787281 4964 generic.go:334] "Generic (PLEG): container finished" podID="8c31ba20-4f86-409e-815f-77ff351c1e39" containerID="6450cd990d1a4ba34274ce8a86e357fccd8dad43695c8a02d1d2467ba1039edc" exitCode=0 Oct 04 03:18:20 crc kubenswrapper[4964]: I1004 03:18:20.787381 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxkwr" event={"ID":"8c31ba20-4f86-409e-815f-77ff351c1e39","Type":"ContainerDied","Data":"6450cd990d1a4ba34274ce8a86e357fccd8dad43695c8a02d1d2467ba1039edc"} Oct 04 03:18:20 crc kubenswrapper[4964]: I1004 03:18:20.790990 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6pg5r" event={"ID":"11d48a73-1170-482c-9c31-8e9b0e5d4096","Type":"ContainerStarted","Data":"632f6ff1aa17bd0a4ac75b8ffadbd1bb17e6de55264ce8667b910c87b2c3c748"} Oct 04 03:18:21 crc kubenswrapper[4964]: I1004 03:18:21.831843 4964 generic.go:334] "Generic (PLEG): container finished" podID="11d48a73-1170-482c-9c31-8e9b0e5d4096" containerID="632f6ff1aa17bd0a4ac75b8ffadbd1bb17e6de55264ce8667b910c87b2c3c748" exitCode=0 Oct 04 03:18:21 crc kubenswrapper[4964]: I1004 03:18:21.831972 4964 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-6pg5r" event={"ID":"11d48a73-1170-482c-9c31-8e9b0e5d4096","Type":"ContainerDied","Data":"632f6ff1aa17bd0a4ac75b8ffadbd1bb17e6de55264ce8667b910c87b2c3c748"} Oct 04 03:18:22 crc kubenswrapper[4964]: I1004 03:18:22.843008 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxkwr" event={"ID":"8c31ba20-4f86-409e-815f-77ff351c1e39","Type":"ContainerStarted","Data":"17d763a1c4f34ac27b60240073a57d403b858252c1b82e2c0b13feb985a6c9d7"} Oct 04 03:18:22 crc kubenswrapper[4964]: I1004 03:18:22.854471 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6pg5r" event={"ID":"11d48a73-1170-482c-9c31-8e9b0e5d4096","Type":"ContainerStarted","Data":"4b0fd1f815f239b5a726ccc2eafca8e89e45504ef781907ce51a52c75a5319f8"} Oct 04 03:18:22 crc kubenswrapper[4964]: I1004 03:18:22.870236 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pxkwr" podStartSLOduration=2.730251748 podStartE2EDuration="7.870218713s" podCreationTimestamp="2025-10-04 03:18:15 +0000 UTC" firstStartedPulling="2025-10-04 03:18:16.734141241 +0000 UTC m=+2276.631099879" lastFinishedPulling="2025-10-04 03:18:21.874108196 +0000 UTC m=+2281.771066844" observedRunningTime="2025-10-04 03:18:22.863427572 +0000 UTC m=+2282.760386250" watchObservedRunningTime="2025-10-04 03:18:22.870218713 +0000 UTC m=+2282.767177361" Oct 04 03:18:22 crc kubenswrapper[4964]: I1004 03:18:22.901356 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6pg5r" podStartSLOduration=2.3490734509999998 podStartE2EDuration="5.901325008s" podCreationTimestamp="2025-10-04 03:18:17 +0000 UTC" firstStartedPulling="2025-10-04 03:18:18.762415481 +0000 UTC m=+2278.659374159" lastFinishedPulling="2025-10-04 03:18:22.314667048 +0000 UTC m=+2282.211625716" observedRunningTime="2025-10-04 
03:18:22.891351983 +0000 UTC m=+2282.788310671" watchObservedRunningTime="2025-10-04 03:18:22.901325008 +0000 UTC m=+2282.798283686" Oct 04 03:18:24 crc kubenswrapper[4964]: I1004 03:18:24.866037 4964 generic.go:334] "Generic (PLEG): container finished" podID="aea639a4-f63d-46d5-abc4-d574ac966161" containerID="5628a5baa3c3582a985f1868eca33b1964ca9303d387ff96027d29a613adcb4b" exitCode=0 Oct 04 03:18:24 crc kubenswrapper[4964]: I1004 03:18:24.866156 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" event={"ID":"aea639a4-f63d-46d5-abc4-d574ac966161","Type":"ContainerDied","Data":"5628a5baa3c3582a985f1868eca33b1964ca9303d387ff96027d29a613adcb4b"} Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.066319 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.066570 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.332035 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.349339 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-ceph\") pod \"aea639a4-f63d-46d5-abc4-d574ac966161\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.349453 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-ssh-key\") pod \"aea639a4-f63d-46d5-abc4-d574ac966161\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.349515 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-inventory\") pod \"aea639a4-f63d-46d5-abc4-d574ac966161\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.349541 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46r6v\" (UniqueName: \"kubernetes.io/projected/aea639a4-f63d-46d5-abc4-d574ac966161-kube-api-access-46r6v\") pod \"aea639a4-f63d-46d5-abc4-d574ac966161\" (UID: \"aea639a4-f63d-46d5-abc4-d574ac966161\") " Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.358282 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea639a4-f63d-46d5-abc4-d574ac966161-kube-api-access-46r6v" (OuterVolumeSpecName: "kube-api-access-46r6v") pod "aea639a4-f63d-46d5-abc4-d574ac966161" (UID: "aea639a4-f63d-46d5-abc4-d574ac966161"). InnerVolumeSpecName "kube-api-access-46r6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.379223 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-ceph" (OuterVolumeSpecName: "ceph") pod "aea639a4-f63d-46d5-abc4-d574ac966161" (UID: "aea639a4-f63d-46d5-abc4-d574ac966161"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.411002 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-inventory" (OuterVolumeSpecName: "inventory") pod "aea639a4-f63d-46d5-abc4-d574ac966161" (UID: "aea639a4-f63d-46d5-abc4-d574ac966161"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.412684 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aea639a4-f63d-46d5-abc4-d574ac966161" (UID: "aea639a4-f63d-46d5-abc4-d574ac966161"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.451571 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.451711 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.451733 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aea639a4-f63d-46d5-abc4-d574ac966161-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.451754 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46r6v\" (UniqueName: \"kubernetes.io/projected/aea639a4-f63d-46d5-abc4-d574ac966161-kube-api-access-46r6v\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.901075 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" event={"ID":"aea639a4-f63d-46d5-abc4-d574ac966161","Type":"ContainerDied","Data":"b9d8254e29d3fecd312784ef001aca55ca9cf6d5c6bf6e334a9d4dc5066d887f"} Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.901113 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.901132 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9d8254e29d3fecd312784ef001aca55ca9cf6d5c6bf6e334a9d4dc5066d887f" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.989461 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf"] Oct 04 03:18:26 crc kubenswrapper[4964]: E1004 03:18:26.990041 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea639a4-f63d-46d5-abc4-d574ac966161" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.990063 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea639a4-f63d-46d5-abc4-d574ac966161" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.990384 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea639a4-f63d-46d5-abc4-d574ac966161" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 04 03:18:26 crc kubenswrapper[4964]: I1004 03:18:26.991374 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.000005 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.000278 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.000365 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.000278 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.000562 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.001531 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.002214 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.002991 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf"] Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.005211 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.130267 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pxkwr" podUID="8c31ba20-4f86-409e-815f-77ff351c1e39" containerName="registry-server" 
probeResult="failure" output=< Oct 04 03:18:27 crc kubenswrapper[4964]: timeout: failed to connect service ":50051" within 1s Oct 04 03:18:27 crc kubenswrapper[4964]: > Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.163083 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.163548 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.163659 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.163713 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.163894 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.164017 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smrjc\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-kube-api-access-smrjc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.164046 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.164068 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: 
\"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.164142 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.164173 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.164363 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.164466 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.164491 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.266399 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.266560 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.266605 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.266674 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.266756 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.266866 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smrjc\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-kube-api-access-smrjc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.267052 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.267567 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.267713 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.267782 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.267914 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.268041 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.268083 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.273060 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.273385 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.273538 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.274339 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.274974 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.275289 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.275457 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 
crc kubenswrapper[4964]: I1004 03:18:27.275535 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.275990 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.276454 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.278231 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.279915 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.291322 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smrjc\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-kube-api-access-smrjc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.325099 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.857307 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.857766 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.864088 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf"] Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.868009 4964 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.909508 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" 
event={"ID":"37d65c2b-caa2-47f0-917c-059ff235e83d","Type":"ContainerStarted","Data":"b0fe68bab74ef8108be956e4e7e4ce3a3a4e7711b48d9b665d0ddba305e30d0d"} Oct 04 03:18:27 crc kubenswrapper[4964]: I1004 03:18:27.917808 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:28 crc kubenswrapper[4964]: I1004 03:18:28.921770 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" event={"ID":"37d65c2b-caa2-47f0-917c-059ff235e83d","Type":"ContainerStarted","Data":"de8e63c3c9255fc9b3b07c766b8b6c8f4c1cf8502e6b05be3a598dc758b08be2"} Oct 04 03:18:28 crc kubenswrapper[4964]: I1004 03:18:28.963445 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" podStartSLOduration=2.471246913 podStartE2EDuration="2.963423195s" podCreationTimestamp="2025-10-04 03:18:26 +0000 UTC" firstStartedPulling="2025-10-04 03:18:27.867813359 +0000 UTC m=+2287.764771997" lastFinishedPulling="2025-10-04 03:18:28.359989621 +0000 UTC m=+2288.256948279" observedRunningTime="2025-10-04 03:18:28.94515993 +0000 UTC m=+2288.842118618" watchObservedRunningTime="2025-10-04 03:18:28.963423195 +0000 UTC m=+2288.860381833" Oct 04 03:18:29 crc kubenswrapper[4964]: I1004 03:18:29.004956 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:29 crc kubenswrapper[4964]: I1004 03:18:29.057414 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6pg5r"] Oct 04 03:18:29 crc kubenswrapper[4964]: I1004 03:18:29.845821 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:18:29 crc kubenswrapper[4964]: E1004 03:18:29.846096 4964 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:18:30 crc kubenswrapper[4964]: I1004 03:18:30.956611 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6pg5r" podUID="11d48a73-1170-482c-9c31-8e9b0e5d4096" containerName="registry-server" containerID="cri-o://4b0fd1f815f239b5a726ccc2eafca8e89e45504ef781907ce51a52c75a5319f8" gracePeriod=2 Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.511843 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.651070 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11d48a73-1170-482c-9c31-8e9b0e5d4096-catalog-content\") pod \"11d48a73-1170-482c-9c31-8e9b0e5d4096\" (UID: \"11d48a73-1170-482c-9c31-8e9b0e5d4096\") " Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.651196 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11d48a73-1170-482c-9c31-8e9b0e5d4096-utilities\") pod \"11d48a73-1170-482c-9c31-8e9b0e5d4096\" (UID: \"11d48a73-1170-482c-9c31-8e9b0e5d4096\") " Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.651245 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzb8r\" (UniqueName: \"kubernetes.io/projected/11d48a73-1170-482c-9c31-8e9b0e5d4096-kube-api-access-fzb8r\") pod \"11d48a73-1170-482c-9c31-8e9b0e5d4096\" (UID: 
\"11d48a73-1170-482c-9c31-8e9b0e5d4096\") " Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.652048 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11d48a73-1170-482c-9c31-8e9b0e5d4096-utilities" (OuterVolumeSpecName: "utilities") pod "11d48a73-1170-482c-9c31-8e9b0e5d4096" (UID: "11d48a73-1170-482c-9c31-8e9b0e5d4096"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.660588 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11d48a73-1170-482c-9c31-8e9b0e5d4096-kube-api-access-fzb8r" (OuterVolumeSpecName: "kube-api-access-fzb8r") pod "11d48a73-1170-482c-9c31-8e9b0e5d4096" (UID: "11d48a73-1170-482c-9c31-8e9b0e5d4096"). InnerVolumeSpecName "kube-api-access-fzb8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.715012 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11d48a73-1170-482c-9c31-8e9b0e5d4096-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11d48a73-1170-482c-9c31-8e9b0e5d4096" (UID: "11d48a73-1170-482c-9c31-8e9b0e5d4096"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.754751 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11d48a73-1170-482c-9c31-8e9b0e5d4096-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.754806 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11d48a73-1170-482c-9c31-8e9b0e5d4096-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.754827 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzb8r\" (UniqueName: \"kubernetes.io/projected/11d48a73-1170-482c-9c31-8e9b0e5d4096-kube-api-access-fzb8r\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.966888 4964 generic.go:334] "Generic (PLEG): container finished" podID="11d48a73-1170-482c-9c31-8e9b0e5d4096" containerID="4b0fd1f815f239b5a726ccc2eafca8e89e45504ef781907ce51a52c75a5319f8" exitCode=0 Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.966936 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6pg5r" event={"ID":"11d48a73-1170-482c-9c31-8e9b0e5d4096","Type":"ContainerDied","Data":"4b0fd1f815f239b5a726ccc2eafca8e89e45504ef781907ce51a52c75a5319f8"} Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.966955 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6pg5r" Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.966974 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6pg5r" event={"ID":"11d48a73-1170-482c-9c31-8e9b0e5d4096","Type":"ContainerDied","Data":"ff9aa59ec88452e2e2271c15a4d046251f873d45f6aff3176241b770476b390f"} Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.966997 4964 scope.go:117] "RemoveContainer" containerID="4b0fd1f815f239b5a726ccc2eafca8e89e45504ef781907ce51a52c75a5319f8" Oct 04 03:18:31 crc kubenswrapper[4964]: I1004 03:18:31.992982 4964 scope.go:117] "RemoveContainer" containerID="632f6ff1aa17bd0a4ac75b8ffadbd1bb17e6de55264ce8667b910c87b2c3c748" Oct 04 03:18:32 crc kubenswrapper[4964]: I1004 03:18:32.010648 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6pg5r"] Oct 04 03:18:32 crc kubenswrapper[4964]: I1004 03:18:32.026314 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6pg5r"] Oct 04 03:18:32 crc kubenswrapper[4964]: I1004 03:18:32.027549 4964 scope.go:117] "RemoveContainer" containerID="fbdf5f4f82fae0a3027ff797ff654759ed4e9cc9c726215f5ca0ce43a1063922" Oct 04 03:18:32 crc kubenswrapper[4964]: I1004 03:18:32.051560 4964 scope.go:117] "RemoveContainer" containerID="4b0fd1f815f239b5a726ccc2eafca8e89e45504ef781907ce51a52c75a5319f8" Oct 04 03:18:32 crc kubenswrapper[4964]: E1004 03:18:32.051979 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b0fd1f815f239b5a726ccc2eafca8e89e45504ef781907ce51a52c75a5319f8\": container with ID starting with 4b0fd1f815f239b5a726ccc2eafca8e89e45504ef781907ce51a52c75a5319f8 not found: ID does not exist" containerID="4b0fd1f815f239b5a726ccc2eafca8e89e45504ef781907ce51a52c75a5319f8" Oct 04 03:18:32 crc kubenswrapper[4964]: I1004 03:18:32.052035 4964 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b0fd1f815f239b5a726ccc2eafca8e89e45504ef781907ce51a52c75a5319f8"} err="failed to get container status \"4b0fd1f815f239b5a726ccc2eafca8e89e45504ef781907ce51a52c75a5319f8\": rpc error: code = NotFound desc = could not find container \"4b0fd1f815f239b5a726ccc2eafca8e89e45504ef781907ce51a52c75a5319f8\": container with ID starting with 4b0fd1f815f239b5a726ccc2eafca8e89e45504ef781907ce51a52c75a5319f8 not found: ID does not exist" Oct 04 03:18:32 crc kubenswrapper[4964]: I1004 03:18:32.052072 4964 scope.go:117] "RemoveContainer" containerID="632f6ff1aa17bd0a4ac75b8ffadbd1bb17e6de55264ce8667b910c87b2c3c748" Oct 04 03:18:32 crc kubenswrapper[4964]: E1004 03:18:32.052361 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632f6ff1aa17bd0a4ac75b8ffadbd1bb17e6de55264ce8667b910c87b2c3c748\": container with ID starting with 632f6ff1aa17bd0a4ac75b8ffadbd1bb17e6de55264ce8667b910c87b2c3c748 not found: ID does not exist" containerID="632f6ff1aa17bd0a4ac75b8ffadbd1bb17e6de55264ce8667b910c87b2c3c748" Oct 04 03:18:32 crc kubenswrapper[4964]: I1004 03:18:32.052409 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632f6ff1aa17bd0a4ac75b8ffadbd1bb17e6de55264ce8667b910c87b2c3c748"} err="failed to get container status \"632f6ff1aa17bd0a4ac75b8ffadbd1bb17e6de55264ce8667b910c87b2c3c748\": rpc error: code = NotFound desc = could not find container \"632f6ff1aa17bd0a4ac75b8ffadbd1bb17e6de55264ce8667b910c87b2c3c748\": container with ID starting with 632f6ff1aa17bd0a4ac75b8ffadbd1bb17e6de55264ce8667b910c87b2c3c748 not found: ID does not exist" Oct 04 03:18:32 crc kubenswrapper[4964]: I1004 03:18:32.052450 4964 scope.go:117] "RemoveContainer" containerID="fbdf5f4f82fae0a3027ff797ff654759ed4e9cc9c726215f5ca0ce43a1063922" Oct 04 03:18:32 crc kubenswrapper[4964]: E1004 
03:18:32.052778 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbdf5f4f82fae0a3027ff797ff654759ed4e9cc9c726215f5ca0ce43a1063922\": container with ID starting with fbdf5f4f82fae0a3027ff797ff654759ed4e9cc9c726215f5ca0ce43a1063922 not found: ID does not exist" containerID="fbdf5f4f82fae0a3027ff797ff654759ed4e9cc9c726215f5ca0ce43a1063922" Oct 04 03:18:32 crc kubenswrapper[4964]: I1004 03:18:32.052815 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbdf5f4f82fae0a3027ff797ff654759ed4e9cc9c726215f5ca0ce43a1063922"} err="failed to get container status \"fbdf5f4f82fae0a3027ff797ff654759ed4e9cc9c726215f5ca0ce43a1063922\": rpc error: code = NotFound desc = could not find container \"fbdf5f4f82fae0a3027ff797ff654759ed4e9cc9c726215f5ca0ce43a1063922\": container with ID starting with fbdf5f4f82fae0a3027ff797ff654759ed4e9cc9c726215f5ca0ce43a1063922 not found: ID does not exist" Oct 04 03:18:32 crc kubenswrapper[4964]: I1004 03:18:32.861018 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11d48a73-1170-482c-9c31-8e9b0e5d4096" path="/var/lib/kubelet/pods/11d48a73-1170-482c-9c31-8e9b0e5d4096/volumes" Oct 04 03:18:37 crc kubenswrapper[4964]: I1004 03:18:37.113741 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pxkwr" podUID="8c31ba20-4f86-409e-815f-77ff351c1e39" containerName="registry-server" probeResult="failure" output=< Oct 04 03:18:37 crc kubenswrapper[4964]: timeout: failed to connect service ":50051" within 1s Oct 04 03:18:37 crc kubenswrapper[4964]: > Oct 04 03:18:43 crc kubenswrapper[4964]: I1004 03:18:43.845178 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:18:43 crc kubenswrapper[4964]: E1004 03:18:43.846986 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:18:46 crc kubenswrapper[4964]: I1004 03:18:46.138309 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:46 crc kubenswrapper[4964]: I1004 03:18:46.212914 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:46 crc kubenswrapper[4964]: I1004 03:18:46.925831 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pxkwr"] Oct 04 03:18:48 crc kubenswrapper[4964]: I1004 03:18:48.136856 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pxkwr" podUID="8c31ba20-4f86-409e-815f-77ff351c1e39" containerName="registry-server" containerID="cri-o://17d763a1c4f34ac27b60240073a57d403b858252c1b82e2c0b13feb985a6c9d7" gracePeriod=2 Oct 04 03:18:48 crc kubenswrapper[4964]: I1004 03:18:48.628020 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:48 crc kubenswrapper[4964]: I1004 03:18:48.797944 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c31ba20-4f86-409e-815f-77ff351c1e39-catalog-content\") pod \"8c31ba20-4f86-409e-815f-77ff351c1e39\" (UID: \"8c31ba20-4f86-409e-815f-77ff351c1e39\") " Oct 04 03:18:48 crc kubenswrapper[4964]: I1004 03:18:48.798095 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c31ba20-4f86-409e-815f-77ff351c1e39-utilities\") pod \"8c31ba20-4f86-409e-815f-77ff351c1e39\" (UID: \"8c31ba20-4f86-409e-815f-77ff351c1e39\") " Oct 04 03:18:48 crc kubenswrapper[4964]: I1004 03:18:48.798214 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpv6q\" (UniqueName: \"kubernetes.io/projected/8c31ba20-4f86-409e-815f-77ff351c1e39-kube-api-access-vpv6q\") pod \"8c31ba20-4f86-409e-815f-77ff351c1e39\" (UID: \"8c31ba20-4f86-409e-815f-77ff351c1e39\") " Oct 04 03:18:48 crc kubenswrapper[4964]: I1004 03:18:48.798855 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c31ba20-4f86-409e-815f-77ff351c1e39-utilities" (OuterVolumeSpecName: "utilities") pod "8c31ba20-4f86-409e-815f-77ff351c1e39" (UID: "8c31ba20-4f86-409e-815f-77ff351c1e39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:18:48 crc kubenswrapper[4964]: I1004 03:18:48.808503 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c31ba20-4f86-409e-815f-77ff351c1e39-kube-api-access-vpv6q" (OuterVolumeSpecName: "kube-api-access-vpv6q") pod "8c31ba20-4f86-409e-815f-77ff351c1e39" (UID: "8c31ba20-4f86-409e-815f-77ff351c1e39"). InnerVolumeSpecName "kube-api-access-vpv6q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:18:48 crc kubenswrapper[4964]: I1004 03:18:48.900209 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c31ba20-4f86-409e-815f-77ff351c1e39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c31ba20-4f86-409e-815f-77ff351c1e39" (UID: "8c31ba20-4f86-409e-815f-77ff351c1e39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:18:48 crc kubenswrapper[4964]: I1004 03:18:48.900639 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c31ba20-4f86-409e-815f-77ff351c1e39-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:48 crc kubenswrapper[4964]: I1004 03:18:48.900663 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpv6q\" (UniqueName: \"kubernetes.io/projected/8c31ba20-4f86-409e-815f-77ff351c1e39-kube-api-access-vpv6q\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:48 crc kubenswrapper[4964]: I1004 03:18:48.900674 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c31ba20-4f86-409e-815f-77ff351c1e39-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:18:49 crc kubenswrapper[4964]: I1004 03:18:49.149299 4964 generic.go:334] "Generic (PLEG): container finished" podID="8c31ba20-4f86-409e-815f-77ff351c1e39" containerID="17d763a1c4f34ac27b60240073a57d403b858252c1b82e2c0b13feb985a6c9d7" exitCode=0 Oct 04 03:18:49 crc kubenswrapper[4964]: I1004 03:18:49.149340 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxkwr" event={"ID":"8c31ba20-4f86-409e-815f-77ff351c1e39","Type":"ContainerDied","Data":"17d763a1c4f34ac27b60240073a57d403b858252c1b82e2c0b13feb985a6c9d7"} Oct 04 03:18:49 crc kubenswrapper[4964]: I1004 03:18:49.149393 4964 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pxkwr" Oct 04 03:18:49 crc kubenswrapper[4964]: I1004 03:18:49.149422 4964 scope.go:117] "RemoveContainer" containerID="17d763a1c4f34ac27b60240073a57d403b858252c1b82e2c0b13feb985a6c9d7" Oct 04 03:18:49 crc kubenswrapper[4964]: I1004 03:18:49.149403 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxkwr" event={"ID":"8c31ba20-4f86-409e-815f-77ff351c1e39","Type":"ContainerDied","Data":"0dcc1523984035b7175a7aa29f70bd23745bd99b5b0bad7ce738f5b1b397bc65"} Oct 04 03:18:49 crc kubenswrapper[4964]: I1004 03:18:49.181911 4964 scope.go:117] "RemoveContainer" containerID="6450cd990d1a4ba34274ce8a86e357fccd8dad43695c8a02d1d2467ba1039edc" Oct 04 03:18:49 crc kubenswrapper[4964]: I1004 03:18:49.192907 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pxkwr"] Oct 04 03:18:49 crc kubenswrapper[4964]: I1004 03:18:49.198655 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pxkwr"] Oct 04 03:18:49 crc kubenswrapper[4964]: I1004 03:18:49.212164 4964 scope.go:117] "RemoveContainer" containerID="d5e1db92028a9073629869cf7ffcb7afcd786c526fd5e2519e3acda9b228501f" Oct 04 03:18:49 crc kubenswrapper[4964]: I1004 03:18:49.262761 4964 scope.go:117] "RemoveContainer" containerID="17d763a1c4f34ac27b60240073a57d403b858252c1b82e2c0b13feb985a6c9d7" Oct 04 03:18:49 crc kubenswrapper[4964]: E1004 03:18:49.263240 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17d763a1c4f34ac27b60240073a57d403b858252c1b82e2c0b13feb985a6c9d7\": container with ID starting with 17d763a1c4f34ac27b60240073a57d403b858252c1b82e2c0b13feb985a6c9d7 not found: ID does not exist" containerID="17d763a1c4f34ac27b60240073a57d403b858252c1b82e2c0b13feb985a6c9d7" Oct 04 03:18:49 crc kubenswrapper[4964]: I1004 03:18:49.263279 4964 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d763a1c4f34ac27b60240073a57d403b858252c1b82e2c0b13feb985a6c9d7"} err="failed to get container status \"17d763a1c4f34ac27b60240073a57d403b858252c1b82e2c0b13feb985a6c9d7\": rpc error: code = NotFound desc = could not find container \"17d763a1c4f34ac27b60240073a57d403b858252c1b82e2c0b13feb985a6c9d7\": container with ID starting with 17d763a1c4f34ac27b60240073a57d403b858252c1b82e2c0b13feb985a6c9d7 not found: ID does not exist" Oct 04 03:18:49 crc kubenswrapper[4964]: I1004 03:18:49.263305 4964 scope.go:117] "RemoveContainer" containerID="6450cd990d1a4ba34274ce8a86e357fccd8dad43695c8a02d1d2467ba1039edc" Oct 04 03:18:49 crc kubenswrapper[4964]: E1004 03:18:49.263733 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6450cd990d1a4ba34274ce8a86e357fccd8dad43695c8a02d1d2467ba1039edc\": container with ID starting with 6450cd990d1a4ba34274ce8a86e357fccd8dad43695c8a02d1d2467ba1039edc not found: ID does not exist" containerID="6450cd990d1a4ba34274ce8a86e357fccd8dad43695c8a02d1d2467ba1039edc" Oct 04 03:18:49 crc kubenswrapper[4964]: I1004 03:18:49.263764 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6450cd990d1a4ba34274ce8a86e357fccd8dad43695c8a02d1d2467ba1039edc"} err="failed to get container status \"6450cd990d1a4ba34274ce8a86e357fccd8dad43695c8a02d1d2467ba1039edc\": rpc error: code = NotFound desc = could not find container \"6450cd990d1a4ba34274ce8a86e357fccd8dad43695c8a02d1d2467ba1039edc\": container with ID starting with 6450cd990d1a4ba34274ce8a86e357fccd8dad43695c8a02d1d2467ba1039edc not found: ID does not exist" Oct 04 03:18:49 crc kubenswrapper[4964]: I1004 03:18:49.263783 4964 scope.go:117] "RemoveContainer" containerID="d5e1db92028a9073629869cf7ffcb7afcd786c526fd5e2519e3acda9b228501f" Oct 04 03:18:49 crc kubenswrapper[4964]: E1004 
03:18:49.264254 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5e1db92028a9073629869cf7ffcb7afcd786c526fd5e2519e3acda9b228501f\": container with ID starting with d5e1db92028a9073629869cf7ffcb7afcd786c526fd5e2519e3acda9b228501f not found: ID does not exist" containerID="d5e1db92028a9073629869cf7ffcb7afcd786c526fd5e2519e3acda9b228501f" Oct 04 03:18:49 crc kubenswrapper[4964]: I1004 03:18:49.264302 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5e1db92028a9073629869cf7ffcb7afcd786c526fd5e2519e3acda9b228501f"} err="failed to get container status \"d5e1db92028a9073629869cf7ffcb7afcd786c526fd5e2519e3acda9b228501f\": rpc error: code = NotFound desc = could not find container \"d5e1db92028a9073629869cf7ffcb7afcd786c526fd5e2519e3acda9b228501f\": container with ID starting with d5e1db92028a9073629869cf7ffcb7afcd786c526fd5e2519e3acda9b228501f not found: ID does not exist" Oct 04 03:18:50 crc kubenswrapper[4964]: I1004 03:18:50.860335 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c31ba20-4f86-409e-815f-77ff351c1e39" path="/var/lib/kubelet/pods/8c31ba20-4f86-409e-815f-77ff351c1e39/volumes" Oct 04 03:18:54 crc kubenswrapper[4964]: I1004 03:18:54.846297 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:18:54 crc kubenswrapper[4964]: E1004 03:18:54.847306 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:19:03 crc kubenswrapper[4964]: I1004 03:19:03.283245 
4964 generic.go:334] "Generic (PLEG): container finished" podID="37d65c2b-caa2-47f0-917c-059ff235e83d" containerID="de8e63c3c9255fc9b3b07c766b8b6c8f4c1cf8502e6b05be3a598dc758b08be2" exitCode=0 Oct 04 03:19:03 crc kubenswrapper[4964]: I1004 03:19:03.283331 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" event={"ID":"37d65c2b-caa2-47f0-917c-059ff235e83d","Type":"ContainerDied","Data":"de8e63c3c9255fc9b3b07c766b8b6c8f4c1cf8502e6b05be3a598dc758b08be2"} Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.750196 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.838844 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ssh-key\") pod \"37d65c2b-caa2-47f0-917c-059ff235e83d\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.838966 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ceph\") pod \"37d65c2b-caa2-47f0-917c-059ff235e83d\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.839011 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ovn-combined-ca-bundle\") pod \"37d65c2b-caa2-47f0-917c-059ff235e83d\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.839057 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"37d65c2b-caa2-47f0-917c-059ff235e83d\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.839091 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-libvirt-combined-ca-bundle\") pod \"37d65c2b-caa2-47f0-917c-059ff235e83d\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.839134 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smrjc\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-kube-api-access-smrjc\") pod \"37d65c2b-caa2-47f0-917c-059ff235e83d\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.839157 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-neutron-metadata-combined-ca-bundle\") pod \"37d65c2b-caa2-47f0-917c-059ff235e83d\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.839217 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"37d65c2b-caa2-47f0-917c-059ff235e83d\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.839280 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"37d65c2b-caa2-47f0-917c-059ff235e83d\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.839361 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-nova-combined-ca-bundle\") pod \"37d65c2b-caa2-47f0-917c-059ff235e83d\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.839402 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-bootstrap-combined-ca-bundle\") pod \"37d65c2b-caa2-47f0-917c-059ff235e83d\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.839464 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-inventory\") pod \"37d65c2b-caa2-47f0-917c-059ff235e83d\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.839490 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-repo-setup-combined-ca-bundle\") pod \"37d65c2b-caa2-47f0-917c-059ff235e83d\" (UID: \"37d65c2b-caa2-47f0-917c-059ff235e83d\") " Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.850075 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "37d65c2b-caa2-47f0-917c-059ff235e83d" (UID: 
"37d65c2b-caa2-47f0-917c-059ff235e83d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.850254 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-kube-api-access-smrjc" (OuterVolumeSpecName: "kube-api-access-smrjc") pod "37d65c2b-caa2-47f0-917c-059ff235e83d" (UID: "37d65c2b-caa2-47f0-917c-059ff235e83d"). InnerVolumeSpecName "kube-api-access-smrjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.850322 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "37d65c2b-caa2-47f0-917c-059ff235e83d" (UID: "37d65c2b-caa2-47f0-917c-059ff235e83d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.850387 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "37d65c2b-caa2-47f0-917c-059ff235e83d" (UID: "37d65c2b-caa2-47f0-917c-059ff235e83d"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.850400 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ceph" (OuterVolumeSpecName: "ceph") pod "37d65c2b-caa2-47f0-917c-059ff235e83d" (UID: "37d65c2b-caa2-47f0-917c-059ff235e83d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.850839 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "37d65c2b-caa2-47f0-917c-059ff235e83d" (UID: "37d65c2b-caa2-47f0-917c-059ff235e83d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.850999 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "37d65c2b-caa2-47f0-917c-059ff235e83d" (UID: "37d65c2b-caa2-47f0-917c-059ff235e83d"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.851020 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "37d65c2b-caa2-47f0-917c-059ff235e83d" (UID: "37d65c2b-caa2-47f0-917c-059ff235e83d"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.851535 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "37d65c2b-caa2-47f0-917c-059ff235e83d" (UID: "37d65c2b-caa2-47f0-917c-059ff235e83d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.853523 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "37d65c2b-caa2-47f0-917c-059ff235e83d" (UID: "37d65c2b-caa2-47f0-917c-059ff235e83d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.853849 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "37d65c2b-caa2-47f0-917c-059ff235e83d" (UID: "37d65c2b-caa2-47f0-917c-059ff235e83d"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.881224 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-inventory" (OuterVolumeSpecName: "inventory") pod "37d65c2b-caa2-47f0-917c-059ff235e83d" (UID: "37d65c2b-caa2-47f0-917c-059ff235e83d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.896869 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "37d65c2b-caa2-47f0-917c-059ff235e83d" (UID: "37d65c2b-caa2-47f0-917c-059ff235e83d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.942850 4964 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.943044 4964 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.943176 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.943298 4964 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.943540 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.943696 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.943893 4964 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:04 crc kubenswrapper[4964]: 
I1004 03:19:04.944050 4964 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.944177 4964 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.944470 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smrjc\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-kube-api-access-smrjc\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.944655 4964 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d65c2b-caa2-47f0-917c-059ff235e83d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.944789 4964 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:04 crc kubenswrapper[4964]: I1004 03:19:04.944977 4964 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/37d65c2b-caa2-47f0-917c-059ff235e83d-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.306740 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" event={"ID":"37d65c2b-caa2-47f0-917c-059ff235e83d","Type":"ContainerDied","Data":"b0fe68bab74ef8108be956e4e7e4ce3a3a4e7711b48d9b665d0ddba305e30d0d"} Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.306797 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0fe68bab74ef8108be956e4e7e4ce3a3a4e7711b48d9b665d0ddba305e30d0d" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.306864 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.434138 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn"] Oct 04 03:19:05 crc kubenswrapper[4964]: E1004 03:19:05.434865 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c31ba20-4f86-409e-815f-77ff351c1e39" containerName="registry-server" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.434895 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c31ba20-4f86-409e-815f-77ff351c1e39" containerName="registry-server" Oct 04 03:19:05 crc kubenswrapper[4964]: E1004 03:19:05.434907 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c31ba20-4f86-409e-815f-77ff351c1e39" containerName="extract-utilities" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.434914 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c31ba20-4f86-409e-815f-77ff351c1e39" containerName="extract-utilities" Oct 04 03:19:05 crc kubenswrapper[4964]: E1004 03:19:05.434925 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d48a73-1170-482c-9c31-8e9b0e5d4096" containerName="extract-content" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.434932 4964 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="11d48a73-1170-482c-9c31-8e9b0e5d4096" containerName="extract-content" Oct 04 03:19:05 crc kubenswrapper[4964]: E1004 03:19:05.434944 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d65c2b-caa2-47f0-917c-059ff235e83d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.434952 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d65c2b-caa2-47f0-917c-059ff235e83d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 04 03:19:05 crc kubenswrapper[4964]: E1004 03:19:05.434971 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d48a73-1170-482c-9c31-8e9b0e5d4096" containerName="extract-utilities" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.434977 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d48a73-1170-482c-9c31-8e9b0e5d4096" containerName="extract-utilities" Oct 04 03:19:05 crc kubenswrapper[4964]: E1004 03:19:05.434987 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c31ba20-4f86-409e-815f-77ff351c1e39" containerName="extract-content" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.434994 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c31ba20-4f86-409e-815f-77ff351c1e39" containerName="extract-content" Oct 04 03:19:05 crc kubenswrapper[4964]: E1004 03:19:05.435008 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d48a73-1170-482c-9c31-8e9b0e5d4096" containerName="registry-server" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.435014 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d48a73-1170-482c-9c31-8e9b0e5d4096" containerName="registry-server" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.435182 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c31ba20-4f86-409e-815f-77ff351c1e39" containerName="registry-server" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 
03:19:05.435194 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="37d65c2b-caa2-47f0-917c-059ff235e83d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.435204 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="11d48a73-1170-482c-9c31-8e9b0e5d4096" containerName="registry-server" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.435769 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.439087 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.439321 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.445192 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.445352 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.451172 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.453672 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.453769 4964 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.453838 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.453972 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b86r8\" (UniqueName: \"kubernetes.io/projected/e1160df4-8a02-4583-8edb-acdb474fd9e0-kube-api-access-b86r8\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.456006 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn"] Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.555772 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.555897 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.555951 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b86r8\" (UniqueName: \"kubernetes.io/projected/e1160df4-8a02-4583-8edb-acdb474fd9e0-kube-api-access-b86r8\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.556165 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.562671 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.569385 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.579525 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b86r8\" (UniqueName: \"kubernetes.io/projected/e1160df4-8a02-4583-8edb-acdb474fd9e0-kube-api-access-b86r8\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.580896 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:05 crc kubenswrapper[4964]: I1004 03:19:05.777734 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:06 crc kubenswrapper[4964]: I1004 03:19:06.449383 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn"] Oct 04 03:19:07 crc kubenswrapper[4964]: I1004 03:19:07.338908 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" event={"ID":"e1160df4-8a02-4583-8edb-acdb474fd9e0","Type":"ContainerStarted","Data":"4e6c2fe5dc81bd8c36575388b2e4ae13917a6c1b3d6b0a49c44e17670f7fb89d"} Oct 04 03:19:07 crc kubenswrapper[4964]: I1004 03:19:07.846841 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:19:07 crc kubenswrapper[4964]: E1004 03:19:07.847102 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:19:08 crc kubenswrapper[4964]: I1004 03:19:08.356537 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" event={"ID":"e1160df4-8a02-4583-8edb-acdb474fd9e0","Type":"ContainerStarted","Data":"32380d314a6afcc28c30885f4ca1544ff79a85b5c54faa005cafef2529279e49"} Oct 04 03:19:13 crc kubenswrapper[4964]: I1004 03:19:13.414099 4964 generic.go:334] "Generic (PLEG): container finished" podID="e1160df4-8a02-4583-8edb-acdb474fd9e0" containerID="32380d314a6afcc28c30885f4ca1544ff79a85b5c54faa005cafef2529279e49" exitCode=0 Oct 04 03:19:13 crc kubenswrapper[4964]: I1004 03:19:13.414227 4964 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" event={"ID":"e1160df4-8a02-4583-8edb-acdb474fd9e0","Type":"ContainerDied","Data":"32380d314a6afcc28c30885f4ca1544ff79a85b5c54faa005cafef2529279e49"} Oct 04 03:19:14 crc kubenswrapper[4964]: I1004 03:19:14.966122 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.081782 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-ssh-key\") pod \"e1160df4-8a02-4583-8edb-acdb474fd9e0\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.082166 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-ceph\") pod \"e1160df4-8a02-4583-8edb-acdb474fd9e0\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.082321 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b86r8\" (UniqueName: \"kubernetes.io/projected/e1160df4-8a02-4583-8edb-acdb474fd9e0-kube-api-access-b86r8\") pod \"e1160df4-8a02-4583-8edb-acdb474fd9e0\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.082433 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-inventory\") pod \"e1160df4-8a02-4583-8edb-acdb474fd9e0\" (UID: \"e1160df4-8a02-4583-8edb-acdb474fd9e0\") " Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.089155 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e1160df4-8a02-4583-8edb-acdb474fd9e0-kube-api-access-b86r8" (OuterVolumeSpecName: "kube-api-access-b86r8") pod "e1160df4-8a02-4583-8edb-acdb474fd9e0" (UID: "e1160df4-8a02-4583-8edb-acdb474fd9e0"). InnerVolumeSpecName "kube-api-access-b86r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.090453 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-ceph" (OuterVolumeSpecName: "ceph") pod "e1160df4-8a02-4583-8edb-acdb474fd9e0" (UID: "e1160df4-8a02-4583-8edb-acdb474fd9e0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.123435 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e1160df4-8a02-4583-8edb-acdb474fd9e0" (UID: "e1160df4-8a02-4583-8edb-acdb474fd9e0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.127770 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-inventory" (OuterVolumeSpecName: "inventory") pod "e1160df4-8a02-4583-8edb-acdb474fd9e0" (UID: "e1160df4-8a02-4583-8edb-acdb474fd9e0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.185570 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.185607 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.185631 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1160df4-8a02-4583-8edb-acdb474fd9e0-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.185644 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b86r8\" (UniqueName: \"kubernetes.io/projected/e1160df4-8a02-4583-8edb-acdb474fd9e0-kube-api-access-b86r8\") on node \"crc\" DevicePath \"\"" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.438483 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" event={"ID":"e1160df4-8a02-4583-8edb-acdb474fd9e0","Type":"ContainerDied","Data":"4e6c2fe5dc81bd8c36575388b2e4ae13917a6c1b3d6b0a49c44e17670f7fb89d"} Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.438541 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e6c2fe5dc81bd8c36575388b2e4ae13917a6c1b3d6b0a49c44e17670f7fb89d" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.438574 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.565933 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5"] Oct 04 03:19:15 crc kubenswrapper[4964]: E1004 03:19:15.566390 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1160df4-8a02-4583-8edb-acdb474fd9e0" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.566412 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1160df4-8a02-4583-8edb-acdb474fd9e0" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.566655 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1160df4-8a02-4583-8edb-acdb474fd9e0" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.567408 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.576252 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.576314 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.576252 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.576428 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.576452 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.576863 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.582658 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5"] Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.694290 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.694360 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.694505 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.694565 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzjhh\" (UniqueName: \"kubernetes.io/projected/8b34ed53-2408-4cab-9c93-eed783a2f31c-kube-api-access-tzjhh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.694652 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b34ed53-2408-4cab-9c93-eed783a2f31c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.694694 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" 
Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.797252 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.797435 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.797470 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.797534 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.797558 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzjhh\" (UniqueName: \"kubernetes.io/projected/8b34ed53-2408-4cab-9c93-eed783a2f31c-kube-api-access-tzjhh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: 
\"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.797660 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b34ed53-2408-4cab-9c93-eed783a2f31c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.798841 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b34ed53-2408-4cab-9c93-eed783a2f31c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.802638 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.803942 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.804114 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.805787 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.824751 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzjhh\" (UniqueName: \"kubernetes.io/projected/8b34ed53-2408-4cab-9c93-eed783a2f31c-kube-api-access-tzjhh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-84rt5\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:15 crc kubenswrapper[4964]: I1004 03:19:15.899799 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:19:16 crc kubenswrapper[4964]: I1004 03:19:16.479020 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5"] Oct 04 03:19:17 crc kubenswrapper[4964]: I1004 03:19:17.457061 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" event={"ID":"8b34ed53-2408-4cab-9c93-eed783a2f31c","Type":"ContainerStarted","Data":"92a30f42066036c1d05ff2c21c643dbb16db94ba9ea1771580f815e20694b4d8"} Oct 04 03:19:17 crc kubenswrapper[4964]: I1004 03:19:17.457438 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" event={"ID":"8b34ed53-2408-4cab-9c93-eed783a2f31c","Type":"ContainerStarted","Data":"3d9f62c17a139494744ce273d214207bafe975d14509e423c0c760c48b72910d"} Oct 04 03:19:17 crc kubenswrapper[4964]: I1004 03:19:17.484146 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" podStartSLOduration=1.813403562 podStartE2EDuration="2.484130003s" podCreationTimestamp="2025-10-04 03:19:15 +0000 UTC" firstStartedPulling="2025-10-04 03:19:16.493202174 +0000 UTC m=+2336.390160812" lastFinishedPulling="2025-10-04 03:19:17.163928575 +0000 UTC m=+2337.060887253" observedRunningTime="2025-10-04 03:19:17.481050792 +0000 UTC m=+2337.378009460" watchObservedRunningTime="2025-10-04 03:19:17.484130003 +0000 UTC m=+2337.381088641" Oct 04 03:19:18 crc kubenswrapper[4964]: I1004 03:19:18.846466 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:19:18 crc kubenswrapper[4964]: E1004 03:19:18.847357 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:19:33 crc kubenswrapper[4964]: I1004 03:19:33.846555 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:19:33 crc kubenswrapper[4964]: E1004 03:19:33.847996 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:19:48 crc kubenswrapper[4964]: I1004 03:19:48.845307 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:19:48 crc kubenswrapper[4964]: E1004 03:19:48.846052 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:20:00 crc kubenswrapper[4964]: I1004 03:20:00.856132 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:20:00 crc kubenswrapper[4964]: E1004 03:20:00.857273 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:20:11 crc kubenswrapper[4964]: I1004 03:20:11.845876 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:20:11 crc kubenswrapper[4964]: E1004 03:20:11.846950 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:20:18 crc kubenswrapper[4964]: I1004 03:20:18.525844 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fsdsj"] Oct 04 03:20:18 crc kubenswrapper[4964]: I1004 03:20:18.529090 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:18 crc kubenswrapper[4964]: I1004 03:20:18.546761 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fsdsj"] Oct 04 03:20:18 crc kubenswrapper[4964]: I1004 03:20:18.559079 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-catalog-content\") pod \"certified-operators-fsdsj\" (UID: \"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd\") " pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:18 crc kubenswrapper[4964]: I1004 03:20:18.559138 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m452c\" (UniqueName: \"kubernetes.io/projected/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-kube-api-access-m452c\") pod \"certified-operators-fsdsj\" (UID: \"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd\") " pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:18 crc kubenswrapper[4964]: I1004 03:20:18.559200 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-utilities\") pod \"certified-operators-fsdsj\" (UID: \"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd\") " pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:18 crc kubenswrapper[4964]: I1004 03:20:18.660055 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-catalog-content\") pod \"certified-operators-fsdsj\" (UID: \"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd\") " pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:18 crc kubenswrapper[4964]: I1004 03:20:18.660108 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m452c\" (UniqueName: \"kubernetes.io/projected/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-kube-api-access-m452c\") pod \"certified-operators-fsdsj\" (UID: \"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd\") " pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:18 crc kubenswrapper[4964]: I1004 03:20:18.660157 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-utilities\") pod \"certified-operators-fsdsj\" (UID: \"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd\") " pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:18 crc kubenswrapper[4964]: I1004 03:20:18.660641 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-catalog-content\") pod \"certified-operators-fsdsj\" (UID: \"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd\") " pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:18 crc kubenswrapper[4964]: I1004 03:20:18.660674 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-utilities\") pod \"certified-operators-fsdsj\" (UID: \"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd\") " pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:18 crc kubenswrapper[4964]: I1004 03:20:18.681669 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m452c\" (UniqueName: \"kubernetes.io/projected/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-kube-api-access-m452c\") pod \"certified-operators-fsdsj\" (UID: \"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd\") " pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:18 crc kubenswrapper[4964]: I1004 03:20:18.878660 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:19 crc kubenswrapper[4964]: I1004 03:20:19.385523 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fsdsj"] Oct 04 03:20:19 crc kubenswrapper[4964]: W1004 03:20:19.394695 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2f3ff9d_de95_4fc8_b08c_ff498357d0dd.slice/crio-a6e870274cae5ac284296180a93cbe0ffbe6d3b58b4a78ea6d1ad25a092f5ee6 WatchSource:0}: Error finding container a6e870274cae5ac284296180a93cbe0ffbe6d3b58b4a78ea6d1ad25a092f5ee6: Status 404 returned error can't find the container with id a6e870274cae5ac284296180a93cbe0ffbe6d3b58b4a78ea6d1ad25a092f5ee6 Oct 04 03:20:20 crc kubenswrapper[4964]: I1004 03:20:20.049010 4964 generic.go:334] "Generic (PLEG): container finished" podID="c2f3ff9d-de95-4fc8-b08c-ff498357d0dd" containerID="57662b11d778d6f6935ffaf705c83958dfa1702406359cbda0cff8ba2f60a839" exitCode=0 Oct 04 03:20:20 crc kubenswrapper[4964]: I1004 03:20:20.049068 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsdsj" event={"ID":"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd","Type":"ContainerDied","Data":"57662b11d778d6f6935ffaf705c83958dfa1702406359cbda0cff8ba2f60a839"} Oct 04 03:20:20 crc kubenswrapper[4964]: I1004 03:20:20.049339 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsdsj" event={"ID":"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd","Type":"ContainerStarted","Data":"a6e870274cae5ac284296180a93cbe0ffbe6d3b58b4a78ea6d1ad25a092f5ee6"} Oct 04 03:20:22 crc kubenswrapper[4964]: I1004 03:20:22.074259 4964 generic.go:334] "Generic (PLEG): container finished" podID="c2f3ff9d-de95-4fc8-b08c-ff498357d0dd" containerID="187e84bc6b59b9f3f8d505c4ce131969aeef805497eee5ed1db01593ba9ad44b" exitCode=0 Oct 04 03:20:22 crc kubenswrapper[4964]: I1004 
03:20:22.074399 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsdsj" event={"ID":"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd","Type":"ContainerDied","Data":"187e84bc6b59b9f3f8d505c4ce131969aeef805497eee5ed1db01593ba9ad44b"} Oct 04 03:20:24 crc kubenswrapper[4964]: I1004 03:20:24.097755 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsdsj" event={"ID":"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd","Type":"ContainerStarted","Data":"ceb5e7c05185573a9ea7b11f0eec8134733df9f5f098c02f37d25b328c0b0e9a"} Oct 04 03:20:24 crc kubenswrapper[4964]: I1004 03:20:24.118217 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fsdsj" podStartSLOduration=2.603953243 podStartE2EDuration="6.118198539s" podCreationTimestamp="2025-10-04 03:20:18 +0000 UTC" firstStartedPulling="2025-10-04 03:20:20.052262742 +0000 UTC m=+2399.949221380" lastFinishedPulling="2025-10-04 03:20:23.566508038 +0000 UTC m=+2403.463466676" observedRunningTime="2025-10-04 03:20:24.11636278 +0000 UTC m=+2404.013321438" watchObservedRunningTime="2025-10-04 03:20:24.118198539 +0000 UTC m=+2404.015157177" Oct 04 03:20:24 crc kubenswrapper[4964]: I1004 03:20:24.845553 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:20:24 crc kubenswrapper[4964]: E1004 03:20:24.845968 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:20:28 crc kubenswrapper[4964]: I1004 03:20:28.879634 4964 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:28 crc kubenswrapper[4964]: I1004 03:20:28.880330 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:28 crc kubenswrapper[4964]: I1004 03:20:28.961922 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:29 crc kubenswrapper[4964]: I1004 03:20:29.223609 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:29 crc kubenswrapper[4964]: I1004 03:20:29.289281 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fsdsj"] Oct 04 03:20:31 crc kubenswrapper[4964]: I1004 03:20:31.164486 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fsdsj" podUID="c2f3ff9d-de95-4fc8-b08c-ff498357d0dd" containerName="registry-server" containerID="cri-o://ceb5e7c05185573a9ea7b11f0eec8134733df9f5f098c02f37d25b328c0b0e9a" gracePeriod=2 Oct 04 03:20:31 crc kubenswrapper[4964]: I1004 03:20:31.638914 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:31 crc kubenswrapper[4964]: I1004 03:20:31.729191 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m452c\" (UniqueName: \"kubernetes.io/projected/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-kube-api-access-m452c\") pod \"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd\" (UID: \"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd\") " Oct 04 03:20:31 crc kubenswrapper[4964]: I1004 03:20:31.729329 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-catalog-content\") pod \"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd\" (UID: \"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd\") " Oct 04 03:20:31 crc kubenswrapper[4964]: I1004 03:20:31.729530 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-utilities\") pod \"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd\" (UID: \"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd\") " Oct 04 03:20:31 crc kubenswrapper[4964]: I1004 03:20:31.731283 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-utilities" (OuterVolumeSpecName: "utilities") pod "c2f3ff9d-de95-4fc8-b08c-ff498357d0dd" (UID: "c2f3ff9d-de95-4fc8-b08c-ff498357d0dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:20:31 crc kubenswrapper[4964]: I1004 03:20:31.741066 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-kube-api-access-m452c" (OuterVolumeSpecName: "kube-api-access-m452c") pod "c2f3ff9d-de95-4fc8-b08c-ff498357d0dd" (UID: "c2f3ff9d-de95-4fc8-b08c-ff498357d0dd"). InnerVolumeSpecName "kube-api-access-m452c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:20:31 crc kubenswrapper[4964]: I1004 03:20:31.831711 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:20:31 crc kubenswrapper[4964]: I1004 03:20:31.831980 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m452c\" (UniqueName: \"kubernetes.io/projected/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-kube-api-access-m452c\") on node \"crc\" DevicePath \"\"" Oct 04 03:20:31 crc kubenswrapper[4964]: I1004 03:20:31.945760 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2f3ff9d-de95-4fc8-b08c-ff498357d0dd" (UID: "c2f3ff9d-de95-4fc8-b08c-ff498357d0dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.037042 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.176332 4964 generic.go:334] "Generic (PLEG): container finished" podID="c2f3ff9d-de95-4fc8-b08c-ff498357d0dd" containerID="ceb5e7c05185573a9ea7b11f0eec8134733df9f5f098c02f37d25b328c0b0e9a" exitCode=0 Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.176384 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsdsj" event={"ID":"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd","Type":"ContainerDied","Data":"ceb5e7c05185573a9ea7b11f0eec8134733df9f5f098c02f37d25b328c0b0e9a"} Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.176414 4964 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fsdsj" Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.176451 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsdsj" event={"ID":"c2f3ff9d-de95-4fc8-b08c-ff498357d0dd","Type":"ContainerDied","Data":"a6e870274cae5ac284296180a93cbe0ffbe6d3b58b4a78ea6d1ad25a092f5ee6"} Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.176487 4964 scope.go:117] "RemoveContainer" containerID="ceb5e7c05185573a9ea7b11f0eec8134733df9f5f098c02f37d25b328c0b0e9a" Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.212790 4964 scope.go:117] "RemoveContainer" containerID="187e84bc6b59b9f3f8d505c4ce131969aeef805497eee5ed1db01593ba9ad44b" Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.223285 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fsdsj"] Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.231922 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fsdsj"] Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.239595 4964 scope.go:117] "RemoveContainer" containerID="57662b11d778d6f6935ffaf705c83958dfa1702406359cbda0cff8ba2f60a839" Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.285852 4964 scope.go:117] "RemoveContainer" containerID="ceb5e7c05185573a9ea7b11f0eec8134733df9f5f098c02f37d25b328c0b0e9a" Oct 04 03:20:32 crc kubenswrapper[4964]: E1004 03:20:32.286197 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceb5e7c05185573a9ea7b11f0eec8134733df9f5f098c02f37d25b328c0b0e9a\": container with ID starting with ceb5e7c05185573a9ea7b11f0eec8134733df9f5f098c02f37d25b328c0b0e9a not found: ID does not exist" containerID="ceb5e7c05185573a9ea7b11f0eec8134733df9f5f098c02f37d25b328c0b0e9a" Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.286234 
4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceb5e7c05185573a9ea7b11f0eec8134733df9f5f098c02f37d25b328c0b0e9a"} err="failed to get container status \"ceb5e7c05185573a9ea7b11f0eec8134733df9f5f098c02f37d25b328c0b0e9a\": rpc error: code = NotFound desc = could not find container \"ceb5e7c05185573a9ea7b11f0eec8134733df9f5f098c02f37d25b328c0b0e9a\": container with ID starting with ceb5e7c05185573a9ea7b11f0eec8134733df9f5f098c02f37d25b328c0b0e9a not found: ID does not exist" Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.286257 4964 scope.go:117] "RemoveContainer" containerID="187e84bc6b59b9f3f8d505c4ce131969aeef805497eee5ed1db01593ba9ad44b" Oct 04 03:20:32 crc kubenswrapper[4964]: E1004 03:20:32.286606 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"187e84bc6b59b9f3f8d505c4ce131969aeef805497eee5ed1db01593ba9ad44b\": container with ID starting with 187e84bc6b59b9f3f8d505c4ce131969aeef805497eee5ed1db01593ba9ad44b not found: ID does not exist" containerID="187e84bc6b59b9f3f8d505c4ce131969aeef805497eee5ed1db01593ba9ad44b" Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.286674 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187e84bc6b59b9f3f8d505c4ce131969aeef805497eee5ed1db01593ba9ad44b"} err="failed to get container status \"187e84bc6b59b9f3f8d505c4ce131969aeef805497eee5ed1db01593ba9ad44b\": rpc error: code = NotFound desc = could not find container \"187e84bc6b59b9f3f8d505c4ce131969aeef805497eee5ed1db01593ba9ad44b\": container with ID starting with 187e84bc6b59b9f3f8d505c4ce131969aeef805497eee5ed1db01593ba9ad44b not found: ID does not exist" Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.286708 4964 scope.go:117] "RemoveContainer" containerID="57662b11d778d6f6935ffaf705c83958dfa1702406359cbda0cff8ba2f60a839" Oct 04 03:20:32 crc kubenswrapper[4964]: E1004 
03:20:32.287087 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57662b11d778d6f6935ffaf705c83958dfa1702406359cbda0cff8ba2f60a839\": container with ID starting with 57662b11d778d6f6935ffaf705c83958dfa1702406359cbda0cff8ba2f60a839 not found: ID does not exist" containerID="57662b11d778d6f6935ffaf705c83958dfa1702406359cbda0cff8ba2f60a839" Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.287118 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57662b11d778d6f6935ffaf705c83958dfa1702406359cbda0cff8ba2f60a839"} err="failed to get container status \"57662b11d778d6f6935ffaf705c83958dfa1702406359cbda0cff8ba2f60a839\": rpc error: code = NotFound desc = could not find container \"57662b11d778d6f6935ffaf705c83958dfa1702406359cbda0cff8ba2f60a839\": container with ID starting with 57662b11d778d6f6935ffaf705c83958dfa1702406359cbda0cff8ba2f60a839 not found: ID does not exist" Oct 04 03:20:32 crc kubenswrapper[4964]: I1004 03:20:32.863402 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f3ff9d-de95-4fc8-b08c-ff498357d0dd" path="/var/lib/kubelet/pods/c2f3ff9d-de95-4fc8-b08c-ff498357d0dd/volumes" Oct 04 03:20:34 crc kubenswrapper[4964]: I1004 03:20:34.200164 4964 generic.go:334] "Generic (PLEG): container finished" podID="8b34ed53-2408-4cab-9c93-eed783a2f31c" containerID="92a30f42066036c1d05ff2c21c643dbb16db94ba9ea1771580f815e20694b4d8" exitCode=0 Oct 04 03:20:34 crc kubenswrapper[4964]: I1004 03:20:34.200299 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" event={"ID":"8b34ed53-2408-4cab-9c93-eed783a2f31c","Type":"ContainerDied","Data":"92a30f42066036c1d05ff2c21c643dbb16db94ba9ea1771580f815e20694b4d8"} Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.788097 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.823134 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ovn-combined-ca-bundle\") pod \"8b34ed53-2408-4cab-9c93-eed783a2f31c\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.823248 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ssh-key\") pod \"8b34ed53-2408-4cab-9c93-eed783a2f31c\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.824069 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzjhh\" (UniqueName: \"kubernetes.io/projected/8b34ed53-2408-4cab-9c93-eed783a2f31c-kube-api-access-tzjhh\") pod \"8b34ed53-2408-4cab-9c93-eed783a2f31c\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.824142 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b34ed53-2408-4cab-9c93-eed783a2f31c-ovncontroller-config-0\") pod \"8b34ed53-2408-4cab-9c93-eed783a2f31c\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.824173 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-inventory\") pod \"8b34ed53-2408-4cab-9c93-eed783a2f31c\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.824267 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ceph\") pod \"8b34ed53-2408-4cab-9c93-eed783a2f31c\" (UID: \"8b34ed53-2408-4cab-9c93-eed783a2f31c\") " Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.841051 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ceph" (OuterVolumeSpecName: "ceph") pod "8b34ed53-2408-4cab-9c93-eed783a2f31c" (UID: "8b34ed53-2408-4cab-9c93-eed783a2f31c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.841169 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8b34ed53-2408-4cab-9c93-eed783a2f31c" (UID: "8b34ed53-2408-4cab-9c93-eed783a2f31c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.842959 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b34ed53-2408-4cab-9c93-eed783a2f31c-kube-api-access-tzjhh" (OuterVolumeSpecName: "kube-api-access-tzjhh") pod "8b34ed53-2408-4cab-9c93-eed783a2f31c" (UID: "8b34ed53-2408-4cab-9c93-eed783a2f31c"). InnerVolumeSpecName "kube-api-access-tzjhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.860587 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8b34ed53-2408-4cab-9c93-eed783a2f31c" (UID: "8b34ed53-2408-4cab-9c93-eed783a2f31c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.866373 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-inventory" (OuterVolumeSpecName: "inventory") pod "8b34ed53-2408-4cab-9c93-eed783a2f31c" (UID: "8b34ed53-2408-4cab-9c93-eed783a2f31c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.871888 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b34ed53-2408-4cab-9c93-eed783a2f31c-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "8b34ed53-2408-4cab-9c93-eed783a2f31c" (UID: "8b34ed53-2408-4cab-9c93-eed783a2f31c"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.926201 4964 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.926468 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.926539 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzjhh\" (UniqueName: \"kubernetes.io/projected/8b34ed53-2408-4cab-9c93-eed783a2f31c-kube-api-access-tzjhh\") on node \"crc\" DevicePath \"\"" Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.926599 4964 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/8b34ed53-2408-4cab-9c93-eed783a2f31c-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.926670 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:20:35 crc kubenswrapper[4964]: I1004 03:20:35.926742 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8b34ed53-2408-4cab-9c93-eed783a2f31c-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.219805 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" event={"ID":"8b34ed53-2408-4cab-9c93-eed783a2f31c","Type":"ContainerDied","Data":"3d9f62c17a139494744ce273d214207bafe975d14509e423c0c760c48b72910d"} Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.220067 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d9f62c17a139494744ce273d214207bafe975d14509e423c0c760c48b72910d" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.219862 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-84rt5" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.351209 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb"] Oct 04 03:20:36 crc kubenswrapper[4964]: E1004 03:20:36.351888 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f3ff9d-de95-4fc8-b08c-ff498357d0dd" containerName="registry-server" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.352057 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f3ff9d-de95-4fc8-b08c-ff498357d0dd" containerName="registry-server" Oct 04 03:20:36 crc kubenswrapper[4964]: E1004 03:20:36.352197 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f3ff9d-de95-4fc8-b08c-ff498357d0dd" containerName="extract-content" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.352303 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f3ff9d-de95-4fc8-b08c-ff498357d0dd" containerName="extract-content" Oct 04 03:20:36 crc kubenswrapper[4964]: E1004 03:20:36.352444 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f3ff9d-de95-4fc8-b08c-ff498357d0dd" containerName="extract-utilities" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.352575 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f3ff9d-de95-4fc8-b08c-ff498357d0dd" containerName="extract-utilities" Oct 04 03:20:36 crc kubenswrapper[4964]: E1004 03:20:36.352757 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b34ed53-2408-4cab-9c93-eed783a2f31c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.352886 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b34ed53-2408-4cab-9c93-eed783a2f31c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.353312 4964 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8b34ed53-2408-4cab-9c93-eed783a2f31c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.353466 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2f3ff9d-de95-4fc8-b08c-ff498357d0dd" containerName="registry-server" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.354473 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.356500 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.356840 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.357071 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.356960 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.357035 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.357219 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.358004 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.364641 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb"] Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.435103 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.435182 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f582f\" (UniqueName: \"kubernetes.io/projected/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-kube-api-access-f582f\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.435214 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.435256 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.435283 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.435406 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.435470 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.537070 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" 
Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.537129 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.537168 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.537202 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f582f\" (UniqueName: \"kubernetes.io/projected/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-kube-api-access-f582f\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.537227 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.537251 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" 
(UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.537278 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.542891 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.542893 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.542998 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.543359 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.544415 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.544534 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.568725 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f582f\" (UniqueName: \"kubernetes.io/projected/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-kube-api-access-f582f\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:36 crc kubenswrapper[4964]: I1004 03:20:36.673364 4964 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:20:37 crc kubenswrapper[4964]: I1004 03:20:37.282871 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb"] Oct 04 03:20:37 crc kubenswrapper[4964]: W1004 03:20:37.287132 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e86be92_e0c6_4e1d_ba37_28f6c32a08d5.slice/crio-235df53ebea035228522f2fa2e95196a6300cfd646d324f69ec5d61650b522aa WatchSource:0}: Error finding container 235df53ebea035228522f2fa2e95196a6300cfd646d324f69ec5d61650b522aa: Status 404 returned error can't find the container with id 235df53ebea035228522f2fa2e95196a6300cfd646d324f69ec5d61650b522aa Oct 04 03:20:37 crc kubenswrapper[4964]: I1004 03:20:37.845519 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:20:37 crc kubenswrapper[4964]: E1004 03:20:37.846419 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:20:38 crc kubenswrapper[4964]: I1004 03:20:38.247205 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" event={"ID":"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5","Type":"ContainerStarted","Data":"041425ebc2024eccabe1edf379a65dafde1bba549379b377f8c2256f165f880e"} Oct 04 03:20:38 crc kubenswrapper[4964]: I1004 03:20:38.247544 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" event={"ID":"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5","Type":"ContainerStarted","Data":"235df53ebea035228522f2fa2e95196a6300cfd646d324f69ec5d61650b522aa"} Oct 04 03:20:38 crc kubenswrapper[4964]: I1004 03:20:38.278900 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" podStartSLOduration=1.7345108040000001 podStartE2EDuration="2.278880703s" podCreationTimestamp="2025-10-04 03:20:36 +0000 UTC" firstStartedPulling="2025-10-04 03:20:37.291026655 +0000 UTC m=+2417.187985303" lastFinishedPulling="2025-10-04 03:20:37.835396564 +0000 UTC m=+2417.732355202" observedRunningTime="2025-10-04 03:20:38.272129515 +0000 UTC m=+2418.169088183" watchObservedRunningTime="2025-10-04 03:20:38.278880703 +0000 UTC m=+2418.175839381" Oct 04 03:20:50 crc kubenswrapper[4964]: I1004 03:20:50.857231 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:20:50 crc kubenswrapper[4964]: E1004 03:20:50.858226 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:21:04 crc kubenswrapper[4964]: I1004 03:21:04.846071 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:21:04 crc kubenswrapper[4964]: E1004 03:21:04.846870 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:21:19 crc kubenswrapper[4964]: I1004 03:21:19.846102 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:21:19 crc kubenswrapper[4964]: E1004 03:21:19.846960 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:21:30 crc kubenswrapper[4964]: I1004 03:21:30.851036 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:21:30 crc kubenswrapper[4964]: E1004 03:21:30.851866 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:21:42 crc kubenswrapper[4964]: I1004 03:21:42.927839 4964 generic.go:334] "Generic (PLEG): container finished" podID="1e86be92-e0c6-4e1d-ba37-28f6c32a08d5" containerID="041425ebc2024eccabe1edf379a65dafde1bba549379b377f8c2256f165f880e" exitCode=0 Oct 04 03:21:42 crc kubenswrapper[4964]: I1004 03:21:42.927970 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" event={"ID":"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5","Type":"ContainerDied","Data":"041425ebc2024eccabe1edf379a65dafde1bba549379b377f8c2256f165f880e"} Oct 04 03:21:43 crc kubenswrapper[4964]: I1004 03:21:43.845719 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:21:43 crc kubenswrapper[4964]: E1004 03:21:43.846444 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.450902 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.539599 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.539963 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-inventory\") pod \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.540067 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f582f\" (UniqueName: \"kubernetes.io/projected/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-kube-api-access-f582f\") pod \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.540102 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-ceph\") pod \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.540245 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-neutron-metadata-combined-ca-bundle\") pod \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 
03:21:44.540288 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-nova-metadata-neutron-config-0\") pod \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.540389 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-ssh-key\") pod \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\" (UID: \"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5\") " Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.545756 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-ceph" (OuterVolumeSpecName: "ceph") pod "1e86be92-e0c6-4e1d-ba37-28f6c32a08d5" (UID: "1e86be92-e0c6-4e1d-ba37-28f6c32a08d5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.546517 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1e86be92-e0c6-4e1d-ba37-28f6c32a08d5" (UID: "1e86be92-e0c6-4e1d-ba37-28f6c32a08d5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.548085 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-kube-api-access-f582f" (OuterVolumeSpecName: "kube-api-access-f582f") pod "1e86be92-e0c6-4e1d-ba37-28f6c32a08d5" (UID: "1e86be92-e0c6-4e1d-ba37-28f6c32a08d5"). InnerVolumeSpecName "kube-api-access-f582f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.573044 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1e86be92-e0c6-4e1d-ba37-28f6c32a08d5" (UID: "1e86be92-e0c6-4e1d-ba37-28f6c32a08d5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.576244 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "1e86be92-e0c6-4e1d-ba37-28f6c32a08d5" (UID: "1e86be92-e0c6-4e1d-ba37-28f6c32a08d5"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.579760 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "1e86be92-e0c6-4e1d-ba37-28f6c32a08d5" (UID: "1e86be92-e0c6-4e1d-ba37-28f6c32a08d5"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.587777 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-inventory" (OuterVolumeSpecName: "inventory") pod "1e86be92-e0c6-4e1d-ba37-28f6c32a08d5" (UID: "1e86be92-e0c6-4e1d-ba37-28f6c32a08d5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.643660 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.643699 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.643712 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f582f\" (UniqueName: \"kubernetes.io/projected/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-kube-api-access-f582f\") on node \"crc\" DevicePath \"\"" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.643729 4964 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.643743 4964 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.643755 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.643767 4964 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e86be92-e0c6-4e1d-ba37-28f6c32a08d5-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" 
DevicePath \"\"" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.954184 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" event={"ID":"1e86be92-e0c6-4e1d-ba37-28f6c32a08d5","Type":"ContainerDied","Data":"235df53ebea035228522f2fa2e95196a6300cfd646d324f69ec5d61650b522aa"} Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.954241 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="235df53ebea035228522f2fa2e95196a6300cfd646d324f69ec5d61650b522aa" Oct 04 03:21:44 crc kubenswrapper[4964]: I1004 03:21:44.954522 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.110797 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g"] Oct 04 03:21:45 crc kubenswrapper[4964]: E1004 03:21:45.111403 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e86be92-e0c6-4e1d-ba37-28f6c32a08d5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.111420 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e86be92-e0c6-4e1d-ba37-28f6c32a08d5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.111601 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e86be92-e0c6-4e1d-ba37-28f6c32a08d5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.112165 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.115964 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.116044 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.116234 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.116352 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.116479 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.116662 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.147486 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g"] Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.253068 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.253129 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.253185 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.253205 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.253226 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.253248 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fxfq\" (UniqueName: \"kubernetes.io/projected/31c841f6-d2c2-4557-8f03-ede13fec6dc0-kube-api-access-7fxfq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.354764 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.354808 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.354868 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.354886 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.354908 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.354949 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fxfq\" (UniqueName: \"kubernetes.io/projected/31c841f6-d2c2-4557-8f03-ede13fec6dc0-kube-api-access-7fxfq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.360510 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.361082 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.361275 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 
03:21:45.361806 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.361997 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.371868 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fxfq\" (UniqueName: \"kubernetes.io/projected/31c841f6-d2c2-4557-8f03-ede13fec6dc0-kube-api-access-7fxfq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jj52g\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.435347 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.804534 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g"] Oct 04 03:21:45 crc kubenswrapper[4964]: I1004 03:21:45.965430 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" event={"ID":"31c841f6-d2c2-4557-8f03-ede13fec6dc0","Type":"ContainerStarted","Data":"7fa2c9c7872442d8a940d26009463f9f19c891f047a519b94d68e1ec5e41a1fc"} Oct 04 03:21:46 crc kubenswrapper[4964]: I1004 03:21:46.976298 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" event={"ID":"31c841f6-d2c2-4557-8f03-ede13fec6dc0","Type":"ContainerStarted","Data":"fd834c73ba0f6f0fd25c01177b17794c892e0b8bcbce8a841bb92cede745eb50"} Oct 04 03:21:47 crc kubenswrapper[4964]: I1004 03:21:47.014990 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" podStartSLOduration=1.473879943 podStartE2EDuration="2.014956824s" podCreationTimestamp="2025-10-04 03:21:45 +0000 UTC" firstStartedPulling="2025-10-04 03:21:45.808995658 +0000 UTC m=+2485.705954306" lastFinishedPulling="2025-10-04 03:21:46.350072509 +0000 UTC m=+2486.247031187" observedRunningTime="2025-10-04 03:21:46.998059916 +0000 UTC m=+2486.895018584" watchObservedRunningTime="2025-10-04 03:21:47.014956824 +0000 UTC m=+2486.911915502" Oct 04 03:21:56 crc kubenswrapper[4964]: I1004 03:21:56.845787 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:21:56 crc kubenswrapper[4964]: E1004 03:21:56.846882 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:22:08 crc kubenswrapper[4964]: I1004 03:22:08.845146 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:22:09 crc kubenswrapper[4964]: I1004 03:22:09.209352 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"f6e655ee4d1c5aeb52f7fb388a42fbe1ff76f1dce97ec4eb0258e257864d16ea"} Oct 04 03:24:34 crc kubenswrapper[4964]: I1004 03:24:34.449700 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:24:34 crc kubenswrapper[4964]: I1004 03:24:34.450365 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:25:04 crc kubenswrapper[4964]: I1004 03:25:04.449340 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:25:04 crc kubenswrapper[4964]: I1004 03:25:04.450000 4964 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:25:34 crc kubenswrapper[4964]: I1004 03:25:34.449205 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:25:34 crc kubenswrapper[4964]: I1004 03:25:34.449820 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:25:34 crc kubenswrapper[4964]: I1004 03:25:34.449878 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 03:25:34 crc kubenswrapper[4964]: I1004 03:25:34.450687 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6e655ee4d1c5aeb52f7fb388a42fbe1ff76f1dce97ec4eb0258e257864d16ea"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 03:25:34 crc kubenswrapper[4964]: I1004 03:25:34.450753 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" 
containerID="cri-o://f6e655ee4d1c5aeb52f7fb388a42fbe1ff76f1dce97ec4eb0258e257864d16ea" gracePeriod=600 Oct 04 03:25:35 crc kubenswrapper[4964]: I1004 03:25:35.478982 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="f6e655ee4d1c5aeb52f7fb388a42fbe1ff76f1dce97ec4eb0258e257864d16ea" exitCode=0 Oct 04 03:25:35 crc kubenswrapper[4964]: I1004 03:25:35.479070 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"f6e655ee4d1c5aeb52f7fb388a42fbe1ff76f1dce97ec4eb0258e257864d16ea"} Oct 04 03:25:35 crc kubenswrapper[4964]: I1004 03:25:35.479680 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d"} Oct 04 03:25:35 crc kubenswrapper[4964]: I1004 03:25:35.479724 4964 scope.go:117] "RemoveContainer" containerID="034ff8cfc7970ce961d63eb73377afbe18c08abdb5c77f91b0f4d90d3dff6844" Oct 04 03:26:29 crc kubenswrapper[4964]: I1004 03:26:29.258507 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-skbm8"] Oct 04 03:26:29 crc kubenswrapper[4964]: I1004 03:26:29.262812 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:29 crc kubenswrapper[4964]: I1004 03:26:29.290518 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-skbm8"] Oct 04 03:26:29 crc kubenswrapper[4964]: I1004 03:26:29.382920 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ed669bd-6425-4e55-9353-9d719b8bb328-utilities\") pod \"redhat-marketplace-skbm8\" (UID: \"5ed669bd-6425-4e55-9353-9d719b8bb328\") " pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:29 crc kubenswrapper[4964]: I1004 03:26:29.383066 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ed669bd-6425-4e55-9353-9d719b8bb328-catalog-content\") pod \"redhat-marketplace-skbm8\" (UID: \"5ed669bd-6425-4e55-9353-9d719b8bb328\") " pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:29 crc kubenswrapper[4964]: I1004 03:26:29.383228 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v4bl\" (UniqueName: \"kubernetes.io/projected/5ed669bd-6425-4e55-9353-9d719b8bb328-kube-api-access-7v4bl\") pod \"redhat-marketplace-skbm8\" (UID: \"5ed669bd-6425-4e55-9353-9d719b8bb328\") " pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:29 crc kubenswrapper[4964]: I1004 03:26:29.485965 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ed669bd-6425-4e55-9353-9d719b8bb328-utilities\") pod \"redhat-marketplace-skbm8\" (UID: \"5ed669bd-6425-4e55-9353-9d719b8bb328\") " pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:29 crc kubenswrapper[4964]: I1004 03:26:29.486053 4964 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ed669bd-6425-4e55-9353-9d719b8bb328-catalog-content\") pod \"redhat-marketplace-skbm8\" (UID: \"5ed669bd-6425-4e55-9353-9d719b8bb328\") " pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:29 crc kubenswrapper[4964]: I1004 03:26:29.486164 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v4bl\" (UniqueName: \"kubernetes.io/projected/5ed669bd-6425-4e55-9353-9d719b8bb328-kube-api-access-7v4bl\") pod \"redhat-marketplace-skbm8\" (UID: \"5ed669bd-6425-4e55-9353-9d719b8bb328\") " pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:29 crc kubenswrapper[4964]: I1004 03:26:29.486688 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ed669bd-6425-4e55-9353-9d719b8bb328-utilities\") pod \"redhat-marketplace-skbm8\" (UID: \"5ed669bd-6425-4e55-9353-9d719b8bb328\") " pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:29 crc kubenswrapper[4964]: I1004 03:26:29.486739 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ed669bd-6425-4e55-9353-9d719b8bb328-catalog-content\") pod \"redhat-marketplace-skbm8\" (UID: \"5ed669bd-6425-4e55-9353-9d719b8bb328\") " pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:29 crc kubenswrapper[4964]: I1004 03:26:29.511673 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v4bl\" (UniqueName: \"kubernetes.io/projected/5ed669bd-6425-4e55-9353-9d719b8bb328-kube-api-access-7v4bl\") pod \"redhat-marketplace-skbm8\" (UID: \"5ed669bd-6425-4e55-9353-9d719b8bb328\") " pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:29 crc kubenswrapper[4964]: I1004 03:26:29.613753 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:30 crc kubenswrapper[4964]: I1004 03:26:30.140577 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-skbm8"] Oct 04 03:26:31 crc kubenswrapper[4964]: I1004 03:26:31.124692 4964 generic.go:334] "Generic (PLEG): container finished" podID="5ed669bd-6425-4e55-9353-9d719b8bb328" containerID="d3804b44daa6ced664961dede103e53a97158b811cfcbd22df3d0d2bbe52fe40" exitCode=0 Oct 04 03:26:31 crc kubenswrapper[4964]: I1004 03:26:31.124829 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skbm8" event={"ID":"5ed669bd-6425-4e55-9353-9d719b8bb328","Type":"ContainerDied","Data":"d3804b44daa6ced664961dede103e53a97158b811cfcbd22df3d0d2bbe52fe40"} Oct 04 03:26:31 crc kubenswrapper[4964]: I1004 03:26:31.125177 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skbm8" event={"ID":"5ed669bd-6425-4e55-9353-9d719b8bb328","Type":"ContainerStarted","Data":"0ce3e681b82bf88828968198684d9be2765acacb6650be36d1ae08ebba9ec1e4"} Oct 04 03:26:31 crc kubenswrapper[4964]: I1004 03:26:31.130092 4964 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 03:26:33 crc kubenswrapper[4964]: I1004 03:26:33.149804 4964 generic.go:334] "Generic (PLEG): container finished" podID="5ed669bd-6425-4e55-9353-9d719b8bb328" containerID="e7db55e76a53e50fdbc3a1f5044ff426aa73057caffd2d67c177fb54ebff8253" exitCode=0 Oct 04 03:26:33 crc kubenswrapper[4964]: I1004 03:26:33.149856 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skbm8" event={"ID":"5ed669bd-6425-4e55-9353-9d719b8bb328","Type":"ContainerDied","Data":"e7db55e76a53e50fdbc3a1f5044ff426aa73057caffd2d67c177fb54ebff8253"} Oct 04 03:26:34 crc kubenswrapper[4964]: I1004 03:26:34.166517 4964 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-skbm8" event={"ID":"5ed669bd-6425-4e55-9353-9d719b8bb328","Type":"ContainerStarted","Data":"96b30ab47c5e09cb7fdaa151e3b2e62cbc27ba884a64f3bc02d8c0fc40c933b3"} Oct 04 03:26:34 crc kubenswrapper[4964]: I1004 03:26:34.207052 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-skbm8" podStartSLOduration=2.793041435 podStartE2EDuration="5.207026451s" podCreationTimestamp="2025-10-04 03:26:29 +0000 UTC" firstStartedPulling="2025-10-04 03:26:31.128530713 +0000 UTC m=+2771.025489391" lastFinishedPulling="2025-10-04 03:26:33.542515729 +0000 UTC m=+2773.439474407" observedRunningTime="2025-10-04 03:26:34.199161003 +0000 UTC m=+2774.096119651" watchObservedRunningTime="2025-10-04 03:26:34.207026451 +0000 UTC m=+2774.103985109" Oct 04 03:26:38 crc kubenswrapper[4964]: I1004 03:26:38.219499 4964 generic.go:334] "Generic (PLEG): container finished" podID="31c841f6-d2c2-4557-8f03-ede13fec6dc0" containerID="fd834c73ba0f6f0fd25c01177b17794c892e0b8bcbce8a841bb92cede745eb50" exitCode=0 Oct 04 03:26:38 crc kubenswrapper[4964]: I1004 03:26:38.219591 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" event={"ID":"31c841f6-d2c2-4557-8f03-ede13fec6dc0","Type":"ContainerDied","Data":"fd834c73ba0f6f0fd25c01177b17794c892e0b8bcbce8a841bb92cede745eb50"} Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.615361 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.615810 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.679161 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.692164 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.808177 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-inventory\") pod \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.808239 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fxfq\" (UniqueName: \"kubernetes.io/projected/31c841f6-d2c2-4557-8f03-ede13fec6dc0-kube-api-access-7fxfq\") pod \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.808282 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-ssh-key\") pod \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.808315 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-libvirt-secret-0\") pod \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.808343 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-libvirt-combined-ca-bundle\") pod \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\" (UID: 
\"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.808433 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-ceph\") pod \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\" (UID: \"31c841f6-d2c2-4557-8f03-ede13fec6dc0\") " Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.813671 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-ceph" (OuterVolumeSpecName: "ceph") pod "31c841f6-d2c2-4557-8f03-ede13fec6dc0" (UID: "31c841f6-d2c2-4557-8f03-ede13fec6dc0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.813839 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31c841f6-d2c2-4557-8f03-ede13fec6dc0-kube-api-access-7fxfq" (OuterVolumeSpecName: "kube-api-access-7fxfq") pod "31c841f6-d2c2-4557-8f03-ede13fec6dc0" (UID: "31c841f6-d2c2-4557-8f03-ede13fec6dc0"). InnerVolumeSpecName "kube-api-access-7fxfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.813982 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "31c841f6-d2c2-4557-8f03-ede13fec6dc0" (UID: "31c841f6-d2c2-4557-8f03-ede13fec6dc0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.832787 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-inventory" (OuterVolumeSpecName: "inventory") pod "31c841f6-d2c2-4557-8f03-ede13fec6dc0" (UID: "31c841f6-d2c2-4557-8f03-ede13fec6dc0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.835332 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "31c841f6-d2c2-4557-8f03-ede13fec6dc0" (UID: "31c841f6-d2c2-4557-8f03-ede13fec6dc0"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.854550 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "31c841f6-d2c2-4557-8f03-ede13fec6dc0" (UID: "31c841f6-d2c2-4557-8f03-ede13fec6dc0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.910051 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.910082 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.910093 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fxfq\" (UniqueName: \"kubernetes.io/projected/31c841f6-d2c2-4557-8f03-ede13fec6dc0-kube-api-access-7fxfq\") on node \"crc\" DevicePath \"\"" Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.910101 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.910110 4964 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 04 03:26:39 crc kubenswrapper[4964]: I1004 03:26:39.910118 4964 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31c841f6-d2c2-4557-8f03-ede13fec6dc0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.241462 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" event={"ID":"31c841f6-d2c2-4557-8f03-ede13fec6dc0","Type":"ContainerDied","Data":"7fa2c9c7872442d8a940d26009463f9f19c891f047a519b94d68e1ec5e41a1fc"} Oct 04 03:26:40 crc 
kubenswrapper[4964]: I1004 03:26:40.241802 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fa2c9c7872442d8a940d26009463f9f19c891f047a519b94d68e1ec5e41a1fc" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.241570 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jj52g" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.337701 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.360959 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds"] Oct 04 03:26:40 crc kubenswrapper[4964]: E1004 03:26:40.361411 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31c841f6-d2c2-4557-8f03-ede13fec6dc0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.361433 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="31c841f6-d2c2-4557-8f03-ede13fec6dc0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.361648 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="31c841f6-d2c2-4557-8f03-ede13fec6dc0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.366089 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.373486 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds"] Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.378253 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.382386 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.382592 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.382821 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.382980 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.383221 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.392122 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvwbn" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.392208 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.392445 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.421279 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-skbm8"] Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.525300 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.525354 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.525484 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qtpb\" (UniqueName: \"kubernetes.io/projected/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-kube-api-access-4qtpb\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.525575 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc 
kubenswrapper[4964]: I1004 03:26:40.525608 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.525799 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.525863 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.525908 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.526009 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.526056 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.526142 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.627850 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.627947 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qtpb\" (UniqueName: 
\"kubernetes.io/projected/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-kube-api-access-4qtpb\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.628001 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.628038 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.628087 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.628117 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-extra-config-0\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.628145 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.628199 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.628224 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.628280 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.628342 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.629180 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.630243 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.634026 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.634078 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.634232 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.634944 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.635138 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.639307 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.640661 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.644217 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.657656 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qtpb\" (UniqueName: \"kubernetes.io/projected/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-kube-api-access-4qtpb\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:40 crc kubenswrapper[4964]: I1004 03:26:40.693563 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:26:41 crc kubenswrapper[4964]: I1004 03:26:41.246125 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds"] Oct 04 03:26:42 crc kubenswrapper[4964]: I1004 03:26:42.279011 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-skbm8" podUID="5ed669bd-6425-4e55-9353-9d719b8bb328" containerName="registry-server" containerID="cri-o://96b30ab47c5e09cb7fdaa151e3b2e62cbc27ba884a64f3bc02d8c0fc40c933b3" gracePeriod=2 Oct 04 03:26:42 crc kubenswrapper[4964]: I1004 03:26:42.279679 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" event={"ID":"2882ad3d-53fb-4ccf-aa3b-fe34165726a4","Type":"ContainerStarted","Data":"119e83a371ce7a08b731b8ed29f415a089939d53119cbd8db8a6482bfc026594"} Oct 04 03:26:42 crc kubenswrapper[4964]: I1004 03:26:42.280144 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" event={"ID":"2882ad3d-53fb-4ccf-aa3b-fe34165726a4","Type":"ContainerStarted","Data":"36b9844b0f21e4e3dec9da6b8e6a6ad508eb480726d2bead0966940f4493fd8a"} Oct 04 03:26:42 crc kubenswrapper[4964]: I1004 03:26:42.313405 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" podStartSLOduration=1.853944829 podStartE2EDuration="2.313375596s" podCreationTimestamp="2025-10-04 03:26:40 +0000 UTC" firstStartedPulling="2025-10-04 03:26:41.256808587 +0000 UTC m=+2781.153767235" lastFinishedPulling="2025-10-04 03:26:41.716239324 +0000 UTC m=+2781.613198002" observedRunningTime="2025-10-04 03:26:42.306449803 +0000 UTC m=+2782.203408461" watchObservedRunningTime="2025-10-04 03:26:42.313375596 +0000 UTC 
m=+2782.210334274" Oct 04 03:26:42 crc kubenswrapper[4964]: I1004 03:26:42.723601 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:42 crc kubenswrapper[4964]: I1004 03:26:42.870049 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v4bl\" (UniqueName: \"kubernetes.io/projected/5ed669bd-6425-4e55-9353-9d719b8bb328-kube-api-access-7v4bl\") pod \"5ed669bd-6425-4e55-9353-9d719b8bb328\" (UID: \"5ed669bd-6425-4e55-9353-9d719b8bb328\") " Oct 04 03:26:42 crc kubenswrapper[4964]: I1004 03:26:42.870142 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ed669bd-6425-4e55-9353-9d719b8bb328-catalog-content\") pod \"5ed669bd-6425-4e55-9353-9d719b8bb328\" (UID: \"5ed669bd-6425-4e55-9353-9d719b8bb328\") " Oct 04 03:26:42 crc kubenswrapper[4964]: I1004 03:26:42.870224 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ed669bd-6425-4e55-9353-9d719b8bb328-utilities\") pod \"5ed669bd-6425-4e55-9353-9d719b8bb328\" (UID: \"5ed669bd-6425-4e55-9353-9d719b8bb328\") " Oct 04 03:26:42 crc kubenswrapper[4964]: I1004 03:26:42.876070 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed669bd-6425-4e55-9353-9d719b8bb328-kube-api-access-7v4bl" (OuterVolumeSpecName: "kube-api-access-7v4bl") pod "5ed669bd-6425-4e55-9353-9d719b8bb328" (UID: "5ed669bd-6425-4e55-9353-9d719b8bb328"). InnerVolumeSpecName "kube-api-access-7v4bl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:26:42 crc kubenswrapper[4964]: I1004 03:26:42.883114 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ed669bd-6425-4e55-9353-9d719b8bb328-utilities" (OuterVolumeSpecName: "utilities") pod "5ed669bd-6425-4e55-9353-9d719b8bb328" (UID: "5ed669bd-6425-4e55-9353-9d719b8bb328"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:26:42 crc kubenswrapper[4964]: I1004 03:26:42.906747 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ed669bd-6425-4e55-9353-9d719b8bb328-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ed669bd-6425-4e55-9353-9d719b8bb328" (UID: "5ed669bd-6425-4e55-9353-9d719b8bb328"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:26:42 crc kubenswrapper[4964]: I1004 03:26:42.974015 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v4bl\" (UniqueName: \"kubernetes.io/projected/5ed669bd-6425-4e55-9353-9d719b8bb328-kube-api-access-7v4bl\") on node \"crc\" DevicePath \"\"" Oct 04 03:26:42 crc kubenswrapper[4964]: I1004 03:26:42.974072 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ed669bd-6425-4e55-9353-9d719b8bb328-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:26:42 crc kubenswrapper[4964]: I1004 03:26:42.974092 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ed669bd-6425-4e55-9353-9d719b8bb328-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:26:43 crc kubenswrapper[4964]: I1004 03:26:43.295410 4964 generic.go:334] "Generic (PLEG): container finished" podID="5ed669bd-6425-4e55-9353-9d719b8bb328" containerID="96b30ab47c5e09cb7fdaa151e3b2e62cbc27ba884a64f3bc02d8c0fc40c933b3" exitCode=0 Oct 
04 03:26:43 crc kubenswrapper[4964]: I1004 03:26:43.295482 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skbm8" Oct 04 03:26:43 crc kubenswrapper[4964]: I1004 03:26:43.295500 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skbm8" event={"ID":"5ed669bd-6425-4e55-9353-9d719b8bb328","Type":"ContainerDied","Data":"96b30ab47c5e09cb7fdaa151e3b2e62cbc27ba884a64f3bc02d8c0fc40c933b3"} Oct 04 03:26:43 crc kubenswrapper[4964]: I1004 03:26:43.295607 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skbm8" event={"ID":"5ed669bd-6425-4e55-9353-9d719b8bb328","Type":"ContainerDied","Data":"0ce3e681b82bf88828968198684d9be2765acacb6650be36d1ae08ebba9ec1e4"} Oct 04 03:26:43 crc kubenswrapper[4964]: I1004 03:26:43.295683 4964 scope.go:117] "RemoveContainer" containerID="96b30ab47c5e09cb7fdaa151e3b2e62cbc27ba884a64f3bc02d8c0fc40c933b3" Oct 04 03:26:43 crc kubenswrapper[4964]: I1004 03:26:43.332928 4964 scope.go:117] "RemoveContainer" containerID="e7db55e76a53e50fdbc3a1f5044ff426aa73057caffd2d67c177fb54ebff8253" Oct 04 03:26:43 crc kubenswrapper[4964]: I1004 03:26:43.371668 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-skbm8"] Oct 04 03:26:43 crc kubenswrapper[4964]: I1004 03:26:43.379585 4964 scope.go:117] "RemoveContainer" containerID="d3804b44daa6ced664961dede103e53a97158b811cfcbd22df3d0d2bbe52fe40" Oct 04 03:26:43 crc kubenswrapper[4964]: I1004 03:26:43.394530 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-skbm8"] Oct 04 03:26:43 crc kubenswrapper[4964]: I1004 03:26:43.432789 4964 scope.go:117] "RemoveContainer" containerID="96b30ab47c5e09cb7fdaa151e3b2e62cbc27ba884a64f3bc02d8c0fc40c933b3" Oct 04 03:26:43 crc kubenswrapper[4964]: E1004 03:26:43.433365 4964 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96b30ab47c5e09cb7fdaa151e3b2e62cbc27ba884a64f3bc02d8c0fc40c933b3\": container with ID starting with 96b30ab47c5e09cb7fdaa151e3b2e62cbc27ba884a64f3bc02d8c0fc40c933b3 not found: ID does not exist" containerID="96b30ab47c5e09cb7fdaa151e3b2e62cbc27ba884a64f3bc02d8c0fc40c933b3" Oct 04 03:26:43 crc kubenswrapper[4964]: I1004 03:26:43.433394 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b30ab47c5e09cb7fdaa151e3b2e62cbc27ba884a64f3bc02d8c0fc40c933b3"} err="failed to get container status \"96b30ab47c5e09cb7fdaa151e3b2e62cbc27ba884a64f3bc02d8c0fc40c933b3\": rpc error: code = NotFound desc = could not find container \"96b30ab47c5e09cb7fdaa151e3b2e62cbc27ba884a64f3bc02d8c0fc40c933b3\": container with ID starting with 96b30ab47c5e09cb7fdaa151e3b2e62cbc27ba884a64f3bc02d8c0fc40c933b3 not found: ID does not exist" Oct 04 03:26:43 crc kubenswrapper[4964]: I1004 03:26:43.433432 4964 scope.go:117] "RemoveContainer" containerID="e7db55e76a53e50fdbc3a1f5044ff426aa73057caffd2d67c177fb54ebff8253" Oct 04 03:26:43 crc kubenswrapper[4964]: E1004 03:26:43.433944 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7db55e76a53e50fdbc3a1f5044ff426aa73057caffd2d67c177fb54ebff8253\": container with ID starting with e7db55e76a53e50fdbc3a1f5044ff426aa73057caffd2d67c177fb54ebff8253 not found: ID does not exist" containerID="e7db55e76a53e50fdbc3a1f5044ff426aa73057caffd2d67c177fb54ebff8253" Oct 04 03:26:43 crc kubenswrapper[4964]: I1004 03:26:43.433970 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7db55e76a53e50fdbc3a1f5044ff426aa73057caffd2d67c177fb54ebff8253"} err="failed to get container status \"e7db55e76a53e50fdbc3a1f5044ff426aa73057caffd2d67c177fb54ebff8253\": rpc error: code = NotFound desc = could not find container 
\"e7db55e76a53e50fdbc3a1f5044ff426aa73057caffd2d67c177fb54ebff8253\": container with ID starting with e7db55e76a53e50fdbc3a1f5044ff426aa73057caffd2d67c177fb54ebff8253 not found: ID does not exist" Oct 04 03:26:43 crc kubenswrapper[4964]: I1004 03:26:43.433984 4964 scope.go:117] "RemoveContainer" containerID="d3804b44daa6ced664961dede103e53a97158b811cfcbd22df3d0d2bbe52fe40" Oct 04 03:26:43 crc kubenswrapper[4964]: E1004 03:26:43.434394 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3804b44daa6ced664961dede103e53a97158b811cfcbd22df3d0d2bbe52fe40\": container with ID starting with d3804b44daa6ced664961dede103e53a97158b811cfcbd22df3d0d2bbe52fe40 not found: ID does not exist" containerID="d3804b44daa6ced664961dede103e53a97158b811cfcbd22df3d0d2bbe52fe40" Oct 04 03:26:43 crc kubenswrapper[4964]: I1004 03:26:43.434413 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3804b44daa6ced664961dede103e53a97158b811cfcbd22df3d0d2bbe52fe40"} err="failed to get container status \"d3804b44daa6ced664961dede103e53a97158b811cfcbd22df3d0d2bbe52fe40\": rpc error: code = NotFound desc = could not find container \"d3804b44daa6ced664961dede103e53a97158b811cfcbd22df3d0d2bbe52fe40\": container with ID starting with d3804b44daa6ced664961dede103e53a97158b811cfcbd22df3d0d2bbe52fe40 not found: ID does not exist" Oct 04 03:26:43 crc kubenswrapper[4964]: E1004 03:26:43.516941 4964 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ed669bd_6425_4e55_9353_9d719b8bb328.slice/crio-0ce3e681b82bf88828968198684d9be2765acacb6650be36d1ae08ebba9ec1e4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ed669bd_6425_4e55_9353_9d719b8bb328.slice\": RecentStats: unable to find data in memory 
cache]" Oct 04 03:26:44 crc kubenswrapper[4964]: I1004 03:26:44.873442 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed669bd-6425-4e55-9353-9d719b8bb328" path="/var/lib/kubelet/pods/5ed669bd-6425-4e55-9353-9d719b8bb328/volumes" Oct 04 03:27:34 crc kubenswrapper[4964]: I1004 03:27:34.449257 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:27:34 crc kubenswrapper[4964]: I1004 03:27:34.450000 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:28:04 crc kubenswrapper[4964]: I1004 03:28:04.449066 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:28:04 crc kubenswrapper[4964]: I1004 03:28:04.449745 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:28:34 crc kubenswrapper[4964]: I1004 03:28:34.449222 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:28:34 crc kubenswrapper[4964]: I1004 03:28:34.450100 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:28:34 crc kubenswrapper[4964]: I1004 03:28:34.450168 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 03:28:34 crc kubenswrapper[4964]: I1004 03:28:34.451229 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 03:28:34 crc kubenswrapper[4964]: I1004 03:28:34.451334 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" gracePeriod=600 Oct 04 03:28:34 crc kubenswrapper[4964]: E1004 03:28:34.586654 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" 
podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:28:35 crc kubenswrapper[4964]: I1004 03:28:35.553189 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" exitCode=0 Oct 04 03:28:35 crc kubenswrapper[4964]: I1004 03:28:35.553270 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d"} Oct 04 03:28:35 crc kubenswrapper[4964]: I1004 03:28:35.553580 4964 scope.go:117] "RemoveContainer" containerID="f6e655ee4d1c5aeb52f7fb388a42fbe1ff76f1dce97ec4eb0258e257864d16ea" Oct 04 03:28:35 crc kubenswrapper[4964]: I1004 03:28:35.555022 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:28:35 crc kubenswrapper[4964]: E1004 03:28:35.555499 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:28:47 crc kubenswrapper[4964]: I1004 03:28:47.845578 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:28:47 crc kubenswrapper[4964]: E1004 03:28:47.846485 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:29:00 crc kubenswrapper[4964]: I1004 03:29:00.858582 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:29:00 crc kubenswrapper[4964]: E1004 03:29:00.859949 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:29:13 crc kubenswrapper[4964]: I1004 03:29:13.845293 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:29:13 crc kubenswrapper[4964]: E1004 03:29:13.845914 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:29:28 crc kubenswrapper[4964]: I1004 03:29:28.845861 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:29:28 crc kubenswrapper[4964]: E1004 03:29:28.847078 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:29:34 crc kubenswrapper[4964]: I1004 03:29:34.810868 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h6tln"] Oct 04 03:29:34 crc kubenswrapper[4964]: E1004 03:29:34.811803 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed669bd-6425-4e55-9353-9d719b8bb328" containerName="registry-server" Oct 04 03:29:34 crc kubenswrapper[4964]: I1004 03:29:34.811819 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed669bd-6425-4e55-9353-9d719b8bb328" containerName="registry-server" Oct 04 03:29:34 crc kubenswrapper[4964]: E1004 03:29:34.811837 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed669bd-6425-4e55-9353-9d719b8bb328" containerName="extract-content" Oct 04 03:29:34 crc kubenswrapper[4964]: I1004 03:29:34.811845 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed669bd-6425-4e55-9353-9d719b8bb328" containerName="extract-content" Oct 04 03:29:34 crc kubenswrapper[4964]: E1004 03:29:34.811860 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed669bd-6425-4e55-9353-9d719b8bb328" containerName="extract-utilities" Oct 04 03:29:34 crc kubenswrapper[4964]: I1004 03:29:34.811869 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed669bd-6425-4e55-9353-9d719b8bb328" containerName="extract-utilities" Oct 04 03:29:34 crc kubenswrapper[4964]: I1004 03:29:34.812075 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed669bd-6425-4e55-9353-9d719b8bb328" containerName="registry-server" Oct 04 03:29:34 crc kubenswrapper[4964]: I1004 03:29:34.813665 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:34 crc kubenswrapper[4964]: I1004 03:29:34.826735 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6tln"] Oct 04 03:29:34 crc kubenswrapper[4964]: I1004 03:29:34.918773 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc393da1-f784-4ab5-9563-6ff29b13e955-catalog-content\") pod \"community-operators-h6tln\" (UID: \"bc393da1-f784-4ab5-9563-6ff29b13e955\") " pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:34 crc kubenswrapper[4964]: I1004 03:29:34.918898 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc393da1-f784-4ab5-9563-6ff29b13e955-utilities\") pod \"community-operators-h6tln\" (UID: \"bc393da1-f784-4ab5-9563-6ff29b13e955\") " pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:34 crc kubenswrapper[4964]: I1004 03:29:34.919190 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcrl7\" (UniqueName: \"kubernetes.io/projected/bc393da1-f784-4ab5-9563-6ff29b13e955-kube-api-access-rcrl7\") pod \"community-operators-h6tln\" (UID: \"bc393da1-f784-4ab5-9563-6ff29b13e955\") " pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:35 crc kubenswrapper[4964]: I1004 03:29:35.021792 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc393da1-f784-4ab5-9563-6ff29b13e955-catalog-content\") pod \"community-operators-h6tln\" (UID: \"bc393da1-f784-4ab5-9563-6ff29b13e955\") " pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:35 crc kubenswrapper[4964]: I1004 03:29:35.021971 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc393da1-f784-4ab5-9563-6ff29b13e955-utilities\") pod \"community-operators-h6tln\" (UID: \"bc393da1-f784-4ab5-9563-6ff29b13e955\") " pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:35 crc kubenswrapper[4964]: I1004 03:29:35.022002 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcrl7\" (UniqueName: \"kubernetes.io/projected/bc393da1-f784-4ab5-9563-6ff29b13e955-kube-api-access-rcrl7\") pod \"community-operators-h6tln\" (UID: \"bc393da1-f784-4ab5-9563-6ff29b13e955\") " pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:35 crc kubenswrapper[4964]: I1004 03:29:35.022416 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc393da1-f784-4ab5-9563-6ff29b13e955-catalog-content\") pod \"community-operators-h6tln\" (UID: \"bc393da1-f784-4ab5-9563-6ff29b13e955\") " pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:35 crc kubenswrapper[4964]: I1004 03:29:35.022716 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc393da1-f784-4ab5-9563-6ff29b13e955-utilities\") pod \"community-operators-h6tln\" (UID: \"bc393da1-f784-4ab5-9563-6ff29b13e955\") " pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:35 crc kubenswrapper[4964]: I1004 03:29:35.053058 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcrl7\" (UniqueName: \"kubernetes.io/projected/bc393da1-f784-4ab5-9563-6ff29b13e955-kube-api-access-rcrl7\") pod \"community-operators-h6tln\" (UID: \"bc393da1-f784-4ab5-9563-6ff29b13e955\") " pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:35 crc kubenswrapper[4964]: I1004 03:29:35.154031 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:35 crc kubenswrapper[4964]: I1004 03:29:35.659762 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6tln"] Oct 04 03:29:35 crc kubenswrapper[4964]: W1004 03:29:35.661220 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc393da1_f784_4ab5_9563_6ff29b13e955.slice/crio-7c56d35a0c362113dbc3ca4a7ab8d88d9eba8b628c2628702cd218a40a8db3c4 WatchSource:0}: Error finding container 7c56d35a0c362113dbc3ca4a7ab8d88d9eba8b628c2628702cd218a40a8db3c4: Status 404 returned error can't find the container with id 7c56d35a0c362113dbc3ca4a7ab8d88d9eba8b628c2628702cd218a40a8db3c4 Oct 04 03:29:36 crc kubenswrapper[4964]: I1004 03:29:36.183677 4964 generic.go:334] "Generic (PLEG): container finished" podID="bc393da1-f784-4ab5-9563-6ff29b13e955" containerID="b4b26e4396b584b57f29612474be89d406b4f2686633cfc3eccca5fc4e08197a" exitCode=0 Oct 04 03:29:36 crc kubenswrapper[4964]: I1004 03:29:36.183785 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6tln" event={"ID":"bc393da1-f784-4ab5-9563-6ff29b13e955","Type":"ContainerDied","Data":"b4b26e4396b584b57f29612474be89d406b4f2686633cfc3eccca5fc4e08197a"} Oct 04 03:29:36 crc kubenswrapper[4964]: I1004 03:29:36.184060 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6tln" event={"ID":"bc393da1-f784-4ab5-9563-6ff29b13e955","Type":"ContainerStarted","Data":"7c56d35a0c362113dbc3ca4a7ab8d88d9eba8b628c2628702cd218a40a8db3c4"} Oct 04 03:29:38 crc kubenswrapper[4964]: I1004 03:29:38.206163 4964 generic.go:334] "Generic (PLEG): container finished" podID="bc393da1-f784-4ab5-9563-6ff29b13e955" containerID="28332b14007867d4976e9ee19d056b6dc552ba39c8e0167590ca4f35a28f0e52" exitCode=0 Oct 04 03:29:38 crc kubenswrapper[4964]: I1004 
03:29:38.206242 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6tln" event={"ID":"bc393da1-f784-4ab5-9563-6ff29b13e955","Type":"ContainerDied","Data":"28332b14007867d4976e9ee19d056b6dc552ba39c8e0167590ca4f35a28f0e52"} Oct 04 03:29:39 crc kubenswrapper[4964]: I1004 03:29:39.222521 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6tln" event={"ID":"bc393da1-f784-4ab5-9563-6ff29b13e955","Type":"ContainerStarted","Data":"ae2f2b27c323f5d0bdfcd51ad209188ccaca7f8cfd9020d166e6b1ebd95c9290"} Oct 04 03:29:39 crc kubenswrapper[4964]: I1004 03:29:39.255351 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h6tln" podStartSLOduration=2.756824503 podStartE2EDuration="5.255321157s" podCreationTimestamp="2025-10-04 03:29:34 +0000 UTC" firstStartedPulling="2025-10-04 03:29:36.186844557 +0000 UTC m=+2956.083803195" lastFinishedPulling="2025-10-04 03:29:38.685341181 +0000 UTC m=+2958.582299849" observedRunningTime="2025-10-04 03:29:39.243893193 +0000 UTC m=+2959.140851871" watchObservedRunningTime="2025-10-04 03:29:39.255321157 +0000 UTC m=+2959.152279825" Oct 04 03:29:41 crc kubenswrapper[4964]: I1004 03:29:41.846172 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:29:41 crc kubenswrapper[4964]: E1004 03:29:41.847004 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:29:44 crc kubenswrapper[4964]: I1004 03:29:44.761562 4964 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/redhat-operators-wqc76"] Oct 04 03:29:44 crc kubenswrapper[4964]: I1004 03:29:44.772875 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:44 crc kubenswrapper[4964]: I1004 03:29:44.779506 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wqc76"] Oct 04 03:29:44 crc kubenswrapper[4964]: I1004 03:29:44.855748 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13900d51-c685-40ef-b17f-51b90860a7fb-catalog-content\") pod \"redhat-operators-wqc76\" (UID: \"13900d51-c685-40ef-b17f-51b90860a7fb\") " pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:44 crc kubenswrapper[4964]: I1004 03:29:44.856068 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crd9v\" (UniqueName: \"kubernetes.io/projected/13900d51-c685-40ef-b17f-51b90860a7fb-kube-api-access-crd9v\") pod \"redhat-operators-wqc76\" (UID: \"13900d51-c685-40ef-b17f-51b90860a7fb\") " pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:44 crc kubenswrapper[4964]: I1004 03:29:44.856159 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13900d51-c685-40ef-b17f-51b90860a7fb-utilities\") pod \"redhat-operators-wqc76\" (UID: \"13900d51-c685-40ef-b17f-51b90860a7fb\") " pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:44 crc kubenswrapper[4964]: I1004 03:29:44.958177 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crd9v\" (UniqueName: \"kubernetes.io/projected/13900d51-c685-40ef-b17f-51b90860a7fb-kube-api-access-crd9v\") pod \"redhat-operators-wqc76\" (UID: 
\"13900d51-c685-40ef-b17f-51b90860a7fb\") " pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:44 crc kubenswrapper[4964]: I1004 03:29:44.958249 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13900d51-c685-40ef-b17f-51b90860a7fb-utilities\") pod \"redhat-operators-wqc76\" (UID: \"13900d51-c685-40ef-b17f-51b90860a7fb\") " pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:44 crc kubenswrapper[4964]: I1004 03:29:44.958360 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13900d51-c685-40ef-b17f-51b90860a7fb-catalog-content\") pod \"redhat-operators-wqc76\" (UID: \"13900d51-c685-40ef-b17f-51b90860a7fb\") " pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:44 crc kubenswrapper[4964]: I1004 03:29:44.959146 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13900d51-c685-40ef-b17f-51b90860a7fb-utilities\") pod \"redhat-operators-wqc76\" (UID: \"13900d51-c685-40ef-b17f-51b90860a7fb\") " pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:44 crc kubenswrapper[4964]: I1004 03:29:44.959183 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13900d51-c685-40ef-b17f-51b90860a7fb-catalog-content\") pod \"redhat-operators-wqc76\" (UID: \"13900d51-c685-40ef-b17f-51b90860a7fb\") " pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:44 crc kubenswrapper[4964]: I1004 03:29:44.978706 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crd9v\" (UniqueName: \"kubernetes.io/projected/13900d51-c685-40ef-b17f-51b90860a7fb-kube-api-access-crd9v\") pod \"redhat-operators-wqc76\" (UID: \"13900d51-c685-40ef-b17f-51b90860a7fb\") " 
pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:45 crc kubenswrapper[4964]: I1004 03:29:45.116849 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:45 crc kubenswrapper[4964]: I1004 03:29:45.155078 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:45 crc kubenswrapper[4964]: I1004 03:29:45.155112 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:45 crc kubenswrapper[4964]: I1004 03:29:45.231533 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:45 crc kubenswrapper[4964]: I1004 03:29:45.350155 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:45 crc kubenswrapper[4964]: I1004 03:29:45.603932 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wqc76"] Oct 04 03:29:46 crc kubenswrapper[4964]: I1004 03:29:46.301602 4964 generic.go:334] "Generic (PLEG): container finished" podID="13900d51-c685-40ef-b17f-51b90860a7fb" containerID="346554116c6a18b25f82fcdc72526d43e69096547f1cd9fa9f299141f9bfc778" exitCode=0 Oct 04 03:29:46 crc kubenswrapper[4964]: I1004 03:29:46.302747 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqc76" event={"ID":"13900d51-c685-40ef-b17f-51b90860a7fb","Type":"ContainerDied","Data":"346554116c6a18b25f82fcdc72526d43e69096547f1cd9fa9f299141f9bfc778"} Oct 04 03:29:46 crc kubenswrapper[4964]: I1004 03:29:46.302794 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqc76" 
event={"ID":"13900d51-c685-40ef-b17f-51b90860a7fb","Type":"ContainerStarted","Data":"279dc43fec8fb13a70593f178551236ebb3cd9a595dc1df773863e4591b4ce7c"} Oct 04 03:29:47 crc kubenswrapper[4964]: I1004 03:29:47.496283 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h6tln"] Oct 04 03:29:47 crc kubenswrapper[4964]: I1004 03:29:47.496953 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h6tln" podUID="bc393da1-f784-4ab5-9563-6ff29b13e955" containerName="registry-server" containerID="cri-o://ae2f2b27c323f5d0bdfcd51ad209188ccaca7f8cfd9020d166e6b1ebd95c9290" gracePeriod=2 Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.090415 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.253178 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc393da1-f784-4ab5-9563-6ff29b13e955-utilities\") pod \"bc393da1-f784-4ab5-9563-6ff29b13e955\" (UID: \"bc393da1-f784-4ab5-9563-6ff29b13e955\") " Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.253243 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcrl7\" (UniqueName: \"kubernetes.io/projected/bc393da1-f784-4ab5-9563-6ff29b13e955-kube-api-access-rcrl7\") pod \"bc393da1-f784-4ab5-9563-6ff29b13e955\" (UID: \"bc393da1-f784-4ab5-9563-6ff29b13e955\") " Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.253303 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc393da1-f784-4ab5-9563-6ff29b13e955-catalog-content\") pod \"bc393da1-f784-4ab5-9563-6ff29b13e955\" (UID: \"bc393da1-f784-4ab5-9563-6ff29b13e955\") " Oct 04 03:29:48 crc 
kubenswrapper[4964]: I1004 03:29:48.253975 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc393da1-f784-4ab5-9563-6ff29b13e955-utilities" (OuterVolumeSpecName: "utilities") pod "bc393da1-f784-4ab5-9563-6ff29b13e955" (UID: "bc393da1-f784-4ab5-9563-6ff29b13e955"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.259829 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc393da1-f784-4ab5-9563-6ff29b13e955-kube-api-access-rcrl7" (OuterVolumeSpecName: "kube-api-access-rcrl7") pod "bc393da1-f784-4ab5-9563-6ff29b13e955" (UID: "bc393da1-f784-4ab5-9563-6ff29b13e955"). InnerVolumeSpecName "kube-api-access-rcrl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.298156 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc393da1-f784-4ab5-9563-6ff29b13e955-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc393da1-f784-4ab5-9563-6ff29b13e955" (UID: "bc393da1-f784-4ab5-9563-6ff29b13e955"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.322369 4964 generic.go:334] "Generic (PLEG): container finished" podID="13900d51-c685-40ef-b17f-51b90860a7fb" containerID="e9d735bc6f2a4f0a381090e3d87eaec8976e2a96ecaf68a475a585846625ce80" exitCode=0 Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.322471 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqc76" event={"ID":"13900d51-c685-40ef-b17f-51b90860a7fb","Type":"ContainerDied","Data":"e9d735bc6f2a4f0a381090e3d87eaec8976e2a96ecaf68a475a585846625ce80"} Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.325757 4964 generic.go:334] "Generic (PLEG): container finished" podID="bc393da1-f784-4ab5-9563-6ff29b13e955" containerID="ae2f2b27c323f5d0bdfcd51ad209188ccaca7f8cfd9020d166e6b1ebd95c9290" exitCode=0 Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.325792 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6tln" event={"ID":"bc393da1-f784-4ab5-9563-6ff29b13e955","Type":"ContainerDied","Data":"ae2f2b27c323f5d0bdfcd51ad209188ccaca7f8cfd9020d166e6b1ebd95c9290"} Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.325805 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h6tln" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.325813 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6tln" event={"ID":"bc393da1-f784-4ab5-9563-6ff29b13e955","Type":"ContainerDied","Data":"7c56d35a0c362113dbc3ca4a7ab8d88d9eba8b628c2628702cd218a40a8db3c4"} Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.325834 4964 scope.go:117] "RemoveContainer" containerID="ae2f2b27c323f5d0bdfcd51ad209188ccaca7f8cfd9020d166e6b1ebd95c9290" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.347533 4964 scope.go:117] "RemoveContainer" containerID="28332b14007867d4976e9ee19d056b6dc552ba39c8e0167590ca4f35a28f0e52" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.356559 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc393da1-f784-4ab5-9563-6ff29b13e955-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.356597 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcrl7\" (UniqueName: \"kubernetes.io/projected/bc393da1-f784-4ab5-9563-6ff29b13e955-kube-api-access-rcrl7\") on node \"crc\" DevicePath \"\"" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.356633 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc393da1-f784-4ab5-9563-6ff29b13e955-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.369842 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h6tln"] Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.381358 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h6tln"] Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.390326 4964 scope.go:117] 
"RemoveContainer" containerID="b4b26e4396b584b57f29612474be89d406b4f2686633cfc3eccca5fc4e08197a" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.429682 4964 scope.go:117] "RemoveContainer" containerID="ae2f2b27c323f5d0bdfcd51ad209188ccaca7f8cfd9020d166e6b1ebd95c9290" Oct 04 03:29:48 crc kubenswrapper[4964]: E1004 03:29:48.430114 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2f2b27c323f5d0bdfcd51ad209188ccaca7f8cfd9020d166e6b1ebd95c9290\": container with ID starting with ae2f2b27c323f5d0bdfcd51ad209188ccaca7f8cfd9020d166e6b1ebd95c9290 not found: ID does not exist" containerID="ae2f2b27c323f5d0bdfcd51ad209188ccaca7f8cfd9020d166e6b1ebd95c9290" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.430148 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2f2b27c323f5d0bdfcd51ad209188ccaca7f8cfd9020d166e6b1ebd95c9290"} err="failed to get container status \"ae2f2b27c323f5d0bdfcd51ad209188ccaca7f8cfd9020d166e6b1ebd95c9290\": rpc error: code = NotFound desc = could not find container \"ae2f2b27c323f5d0bdfcd51ad209188ccaca7f8cfd9020d166e6b1ebd95c9290\": container with ID starting with ae2f2b27c323f5d0bdfcd51ad209188ccaca7f8cfd9020d166e6b1ebd95c9290 not found: ID does not exist" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.430173 4964 scope.go:117] "RemoveContainer" containerID="28332b14007867d4976e9ee19d056b6dc552ba39c8e0167590ca4f35a28f0e52" Oct 04 03:29:48 crc kubenswrapper[4964]: E1004 03:29:48.430670 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28332b14007867d4976e9ee19d056b6dc552ba39c8e0167590ca4f35a28f0e52\": container with ID starting with 28332b14007867d4976e9ee19d056b6dc552ba39c8e0167590ca4f35a28f0e52 not found: ID does not exist" containerID="28332b14007867d4976e9ee19d056b6dc552ba39c8e0167590ca4f35a28f0e52" Oct 04 03:29:48 crc 
kubenswrapper[4964]: I1004 03:29:48.430693 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28332b14007867d4976e9ee19d056b6dc552ba39c8e0167590ca4f35a28f0e52"} err="failed to get container status \"28332b14007867d4976e9ee19d056b6dc552ba39c8e0167590ca4f35a28f0e52\": rpc error: code = NotFound desc = could not find container \"28332b14007867d4976e9ee19d056b6dc552ba39c8e0167590ca4f35a28f0e52\": container with ID starting with 28332b14007867d4976e9ee19d056b6dc552ba39c8e0167590ca4f35a28f0e52 not found: ID does not exist" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.430708 4964 scope.go:117] "RemoveContainer" containerID="b4b26e4396b584b57f29612474be89d406b4f2686633cfc3eccca5fc4e08197a" Oct 04 03:29:48 crc kubenswrapper[4964]: E1004 03:29:48.430944 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4b26e4396b584b57f29612474be89d406b4f2686633cfc3eccca5fc4e08197a\": container with ID starting with b4b26e4396b584b57f29612474be89d406b4f2686633cfc3eccca5fc4e08197a not found: ID does not exist" containerID="b4b26e4396b584b57f29612474be89d406b4f2686633cfc3eccca5fc4e08197a" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.430968 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b26e4396b584b57f29612474be89d406b4f2686633cfc3eccca5fc4e08197a"} err="failed to get container status \"b4b26e4396b584b57f29612474be89d406b4f2686633cfc3eccca5fc4e08197a\": rpc error: code = NotFound desc = could not find container \"b4b26e4396b584b57f29612474be89d406b4f2686633cfc3eccca5fc4e08197a\": container with ID starting with b4b26e4396b584b57f29612474be89d406b4f2686633cfc3eccca5fc4e08197a not found: ID does not exist" Oct 04 03:29:48 crc kubenswrapper[4964]: I1004 03:29:48.867505 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc393da1-f784-4ab5-9563-6ff29b13e955" 
path="/var/lib/kubelet/pods/bc393da1-f784-4ab5-9563-6ff29b13e955/volumes" Oct 04 03:29:49 crc kubenswrapper[4964]: I1004 03:29:49.337981 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqc76" event={"ID":"13900d51-c685-40ef-b17f-51b90860a7fb","Type":"ContainerStarted","Data":"85bd7ff20776bd129a2c22ad173c2b8bf56926db0c300eceb7d4806ebd70410c"} Oct 04 03:29:49 crc kubenswrapper[4964]: I1004 03:29:49.374305 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wqc76" podStartSLOduration=2.938604686 podStartE2EDuration="5.374286s" podCreationTimestamp="2025-10-04 03:29:44 +0000 UTC" firstStartedPulling="2025-10-04 03:29:46.30470125 +0000 UTC m=+2966.201659898" lastFinishedPulling="2025-10-04 03:29:48.740382544 +0000 UTC m=+2968.637341212" observedRunningTime="2025-10-04 03:29:49.36675759 +0000 UTC m=+2969.263716308" watchObservedRunningTime="2025-10-04 03:29:49.374286 +0000 UTC m=+2969.271244648" Oct 04 03:29:55 crc kubenswrapper[4964]: I1004 03:29:55.117681 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:55 crc kubenswrapper[4964]: I1004 03:29:55.118444 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:55 crc kubenswrapper[4964]: I1004 03:29:55.197585 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:55 crc kubenswrapper[4964]: I1004 03:29:55.465559 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:55 crc kubenswrapper[4964]: I1004 03:29:55.541982 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wqc76"] Oct 04 03:29:56 crc kubenswrapper[4964]: I1004 
03:29:56.846718 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:29:56 crc kubenswrapper[4964]: E1004 03:29:56.847431 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:29:57 crc kubenswrapper[4964]: I1004 03:29:57.431206 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wqc76" podUID="13900d51-c685-40ef-b17f-51b90860a7fb" containerName="registry-server" containerID="cri-o://85bd7ff20776bd129a2c22ad173c2b8bf56926db0c300eceb7d4806ebd70410c" gracePeriod=2 Oct 04 03:29:57 crc kubenswrapper[4964]: I1004 03:29:57.941609 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:57 crc kubenswrapper[4964]: I1004 03:29:57.996646 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13900d51-c685-40ef-b17f-51b90860a7fb-utilities\") pod \"13900d51-c685-40ef-b17f-51b90860a7fb\" (UID: \"13900d51-c685-40ef-b17f-51b90860a7fb\") " Oct 04 03:29:57 crc kubenswrapper[4964]: I1004 03:29:57.998154 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13900d51-c685-40ef-b17f-51b90860a7fb-catalog-content\") pod \"13900d51-c685-40ef-b17f-51b90860a7fb\" (UID: \"13900d51-c685-40ef-b17f-51b90860a7fb\") " Oct 04 03:29:57 crc kubenswrapper[4964]: I1004 03:29:57.998774 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13900d51-c685-40ef-b17f-51b90860a7fb-utilities" (OuterVolumeSpecName: "utilities") pod "13900d51-c685-40ef-b17f-51b90860a7fb" (UID: "13900d51-c685-40ef-b17f-51b90860a7fb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:29:57 crc kubenswrapper[4964]: I1004 03:29:57.998870 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crd9v\" (UniqueName: \"kubernetes.io/projected/13900d51-c685-40ef-b17f-51b90860a7fb-kube-api-access-crd9v\") pod \"13900d51-c685-40ef-b17f-51b90860a7fb\" (UID: \"13900d51-c685-40ef-b17f-51b90860a7fb\") " Oct 04 03:29:57 crc kubenswrapper[4964]: I1004 03:29:57.999538 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13900d51-c685-40ef-b17f-51b90860a7fb-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.012761 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13900d51-c685-40ef-b17f-51b90860a7fb-kube-api-access-crd9v" (OuterVolumeSpecName: "kube-api-access-crd9v") pod "13900d51-c685-40ef-b17f-51b90860a7fb" (UID: "13900d51-c685-40ef-b17f-51b90860a7fb"). InnerVolumeSpecName "kube-api-access-crd9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.100884 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crd9v\" (UniqueName: \"kubernetes.io/projected/13900d51-c685-40ef-b17f-51b90860a7fb-kube-api-access-crd9v\") on node \"crc\" DevicePath \"\"" Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.103957 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13900d51-c685-40ef-b17f-51b90860a7fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13900d51-c685-40ef-b17f-51b90860a7fb" (UID: "13900d51-c685-40ef-b17f-51b90860a7fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.204055 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13900d51-c685-40ef-b17f-51b90860a7fb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.458506 4964 generic.go:334] "Generic (PLEG): container finished" podID="13900d51-c685-40ef-b17f-51b90860a7fb" containerID="85bd7ff20776bd129a2c22ad173c2b8bf56926db0c300eceb7d4806ebd70410c" exitCode=0 Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.458574 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wqc76" Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.458572 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqc76" event={"ID":"13900d51-c685-40ef-b17f-51b90860a7fb","Type":"ContainerDied","Data":"85bd7ff20776bd129a2c22ad173c2b8bf56926db0c300eceb7d4806ebd70410c"} Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.458694 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqc76" event={"ID":"13900d51-c685-40ef-b17f-51b90860a7fb","Type":"ContainerDied","Data":"279dc43fec8fb13a70593f178551236ebb3cd9a595dc1df773863e4591b4ce7c"} Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.458758 4964 scope.go:117] "RemoveContainer" containerID="85bd7ff20776bd129a2c22ad173c2b8bf56926db0c300eceb7d4806ebd70410c" Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.492060 4964 scope.go:117] "RemoveContainer" containerID="e9d735bc6f2a4f0a381090e3d87eaec8976e2a96ecaf68a475a585846625ce80" Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.498701 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wqc76"] Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 
03:29:58.507111 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wqc76"] Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.519647 4964 scope.go:117] "RemoveContainer" containerID="346554116c6a18b25f82fcdc72526d43e69096547f1cd9fa9f299141f9bfc778" Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.580559 4964 scope.go:117] "RemoveContainer" containerID="85bd7ff20776bd129a2c22ad173c2b8bf56926db0c300eceb7d4806ebd70410c" Oct 04 03:29:58 crc kubenswrapper[4964]: E1004 03:29:58.581122 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85bd7ff20776bd129a2c22ad173c2b8bf56926db0c300eceb7d4806ebd70410c\": container with ID starting with 85bd7ff20776bd129a2c22ad173c2b8bf56926db0c300eceb7d4806ebd70410c not found: ID does not exist" containerID="85bd7ff20776bd129a2c22ad173c2b8bf56926db0c300eceb7d4806ebd70410c" Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.581242 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85bd7ff20776bd129a2c22ad173c2b8bf56926db0c300eceb7d4806ebd70410c"} err="failed to get container status \"85bd7ff20776bd129a2c22ad173c2b8bf56926db0c300eceb7d4806ebd70410c\": rpc error: code = NotFound desc = could not find container \"85bd7ff20776bd129a2c22ad173c2b8bf56926db0c300eceb7d4806ebd70410c\": container with ID starting with 85bd7ff20776bd129a2c22ad173c2b8bf56926db0c300eceb7d4806ebd70410c not found: ID does not exist" Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.581343 4964 scope.go:117] "RemoveContainer" containerID="e9d735bc6f2a4f0a381090e3d87eaec8976e2a96ecaf68a475a585846625ce80" Oct 04 03:29:58 crc kubenswrapper[4964]: E1004 03:29:58.582018 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9d735bc6f2a4f0a381090e3d87eaec8976e2a96ecaf68a475a585846625ce80\": container with ID 
starting with e9d735bc6f2a4f0a381090e3d87eaec8976e2a96ecaf68a475a585846625ce80 not found: ID does not exist" containerID="e9d735bc6f2a4f0a381090e3d87eaec8976e2a96ecaf68a475a585846625ce80" Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.582078 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9d735bc6f2a4f0a381090e3d87eaec8976e2a96ecaf68a475a585846625ce80"} err="failed to get container status \"e9d735bc6f2a4f0a381090e3d87eaec8976e2a96ecaf68a475a585846625ce80\": rpc error: code = NotFound desc = could not find container \"e9d735bc6f2a4f0a381090e3d87eaec8976e2a96ecaf68a475a585846625ce80\": container with ID starting with e9d735bc6f2a4f0a381090e3d87eaec8976e2a96ecaf68a475a585846625ce80 not found: ID does not exist" Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.582119 4964 scope.go:117] "RemoveContainer" containerID="346554116c6a18b25f82fcdc72526d43e69096547f1cd9fa9f299141f9bfc778" Oct 04 03:29:58 crc kubenswrapper[4964]: E1004 03:29:58.582528 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346554116c6a18b25f82fcdc72526d43e69096547f1cd9fa9f299141f9bfc778\": container with ID starting with 346554116c6a18b25f82fcdc72526d43e69096547f1cd9fa9f299141f9bfc778 not found: ID does not exist" containerID="346554116c6a18b25f82fcdc72526d43e69096547f1cd9fa9f299141f9bfc778" Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.582663 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346554116c6a18b25f82fcdc72526d43e69096547f1cd9fa9f299141f9bfc778"} err="failed to get container status \"346554116c6a18b25f82fcdc72526d43e69096547f1cd9fa9f299141f9bfc778\": rpc error: code = NotFound desc = could not find container \"346554116c6a18b25f82fcdc72526d43e69096547f1cd9fa9f299141f9bfc778\": container with ID starting with 346554116c6a18b25f82fcdc72526d43e69096547f1cd9fa9f299141f9bfc778 not found: 
ID does not exist" Oct 04 03:29:58 crc kubenswrapper[4964]: I1004 03:29:58.863537 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13900d51-c685-40ef-b17f-51b90860a7fb" path="/var/lib/kubelet/pods/13900d51-c685-40ef-b17f-51b90860a7fb/volumes" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.207025 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl"] Oct 04 03:30:00 crc kubenswrapper[4964]: E1004 03:30:00.207739 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13900d51-c685-40ef-b17f-51b90860a7fb" containerName="extract-content" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.207756 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="13900d51-c685-40ef-b17f-51b90860a7fb" containerName="extract-content" Oct 04 03:30:00 crc kubenswrapper[4964]: E1004 03:30:00.207767 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc393da1-f784-4ab5-9563-6ff29b13e955" containerName="extract-utilities" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.207774 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc393da1-f784-4ab5-9563-6ff29b13e955" containerName="extract-utilities" Oct 04 03:30:00 crc kubenswrapper[4964]: E1004 03:30:00.207790 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc393da1-f784-4ab5-9563-6ff29b13e955" containerName="extract-content" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.207797 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc393da1-f784-4ab5-9563-6ff29b13e955" containerName="extract-content" Oct 04 03:30:00 crc kubenswrapper[4964]: E1004 03:30:00.207811 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc393da1-f784-4ab5-9563-6ff29b13e955" containerName="registry-server" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.209985 4964 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bc393da1-f784-4ab5-9563-6ff29b13e955" containerName="registry-server" Oct 04 03:30:00 crc kubenswrapper[4964]: E1004 03:30:00.210037 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13900d51-c685-40ef-b17f-51b90860a7fb" containerName="registry-server" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.210047 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="13900d51-c685-40ef-b17f-51b90860a7fb" containerName="registry-server" Oct 04 03:30:00 crc kubenswrapper[4964]: E1004 03:30:00.210061 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13900d51-c685-40ef-b17f-51b90860a7fb" containerName="extract-utilities" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.210069 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="13900d51-c685-40ef-b17f-51b90860a7fb" containerName="extract-utilities" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.211817 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc393da1-f784-4ab5-9563-6ff29b13e955" containerName="registry-server" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.211846 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="13900d51-c685-40ef-b17f-51b90860a7fb" containerName="registry-server" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.212607 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.216379 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.216648 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.233512 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl"] Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.249162 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acbb1675-5ef4-4649-a948-640d25df34b1-config-volume\") pod \"collect-profiles-29325810-p7mrl\" (UID: \"acbb1675-5ef4-4649-a948-640d25df34b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.249323 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acbb1675-5ef4-4649-a948-640d25df34b1-secret-volume\") pod \"collect-profiles-29325810-p7mrl\" (UID: \"acbb1675-5ef4-4649-a948-640d25df34b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.249659 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbcvw\" (UniqueName: \"kubernetes.io/projected/acbb1675-5ef4-4649-a948-640d25df34b1-kube-api-access-pbcvw\") pod \"collect-profiles-29325810-p7mrl\" (UID: \"acbb1675-5ef4-4649-a948-640d25df34b1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.350335 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbcvw\" (UniqueName: \"kubernetes.io/projected/acbb1675-5ef4-4649-a948-640d25df34b1-kube-api-access-pbcvw\") pod \"collect-profiles-29325810-p7mrl\" (UID: \"acbb1675-5ef4-4649-a948-640d25df34b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.350416 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acbb1675-5ef4-4649-a948-640d25df34b1-config-volume\") pod \"collect-profiles-29325810-p7mrl\" (UID: \"acbb1675-5ef4-4649-a948-640d25df34b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.350467 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acbb1675-5ef4-4649-a948-640d25df34b1-secret-volume\") pod \"collect-profiles-29325810-p7mrl\" (UID: \"acbb1675-5ef4-4649-a948-640d25df34b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.351577 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acbb1675-5ef4-4649-a948-640d25df34b1-config-volume\") pod \"collect-profiles-29325810-p7mrl\" (UID: \"acbb1675-5ef4-4649-a948-640d25df34b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.355917 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/acbb1675-5ef4-4649-a948-640d25df34b1-secret-volume\") pod \"collect-profiles-29325810-p7mrl\" (UID: \"acbb1675-5ef4-4649-a948-640d25df34b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.366334 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbcvw\" (UniqueName: \"kubernetes.io/projected/acbb1675-5ef4-4649-a948-640d25df34b1-kube-api-access-pbcvw\") pod \"collect-profiles-29325810-p7mrl\" (UID: \"acbb1675-5ef4-4649-a948-640d25df34b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.534176 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" Oct 04 03:30:00 crc kubenswrapper[4964]: I1004 03:30:00.999085 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl"] Oct 04 03:30:01 crc kubenswrapper[4964]: I1004 03:30:01.492315 4964 generic.go:334] "Generic (PLEG): container finished" podID="acbb1675-5ef4-4649-a948-640d25df34b1" containerID="c8340c2e535b804cf6bcd1f40b2efeda9b269134019202465678228c59a6a980" exitCode=0 Oct 04 03:30:01 crc kubenswrapper[4964]: I1004 03:30:01.492429 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" event={"ID":"acbb1675-5ef4-4649-a948-640d25df34b1","Type":"ContainerDied","Data":"c8340c2e535b804cf6bcd1f40b2efeda9b269134019202465678228c59a6a980"} Oct 04 03:30:01 crc kubenswrapper[4964]: I1004 03:30:01.492791 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" 
event={"ID":"acbb1675-5ef4-4649-a948-640d25df34b1","Type":"ContainerStarted","Data":"44a94d7ec35027b83e511734225c53ab9413562833af194fa132ffd5f7e18739"} Oct 04 03:30:02 crc kubenswrapper[4964]: I1004 03:30:02.979258 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" Oct 04 03:30:03 crc kubenswrapper[4964]: I1004 03:30:03.105273 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbcvw\" (UniqueName: \"kubernetes.io/projected/acbb1675-5ef4-4649-a948-640d25df34b1-kube-api-access-pbcvw\") pod \"acbb1675-5ef4-4649-a948-640d25df34b1\" (UID: \"acbb1675-5ef4-4649-a948-640d25df34b1\") " Oct 04 03:30:03 crc kubenswrapper[4964]: I1004 03:30:03.105341 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acbb1675-5ef4-4649-a948-640d25df34b1-config-volume\") pod \"acbb1675-5ef4-4649-a948-640d25df34b1\" (UID: \"acbb1675-5ef4-4649-a948-640d25df34b1\") " Oct 04 03:30:03 crc kubenswrapper[4964]: I1004 03:30:03.105403 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acbb1675-5ef4-4649-a948-640d25df34b1-secret-volume\") pod \"acbb1675-5ef4-4649-a948-640d25df34b1\" (UID: \"acbb1675-5ef4-4649-a948-640d25df34b1\") " Oct 04 03:30:03 crc kubenswrapper[4964]: I1004 03:30:03.106009 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acbb1675-5ef4-4649-a948-640d25df34b1-config-volume" (OuterVolumeSpecName: "config-volume") pod "acbb1675-5ef4-4649-a948-640d25df34b1" (UID: "acbb1675-5ef4-4649-a948-640d25df34b1"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:30:03 crc kubenswrapper[4964]: I1004 03:30:03.111231 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acbb1675-5ef4-4649-a948-640d25df34b1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "acbb1675-5ef4-4649-a948-640d25df34b1" (UID: "acbb1675-5ef4-4649-a948-640d25df34b1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:30:03 crc kubenswrapper[4964]: I1004 03:30:03.111911 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acbb1675-5ef4-4649-a948-640d25df34b1-kube-api-access-pbcvw" (OuterVolumeSpecName: "kube-api-access-pbcvw") pod "acbb1675-5ef4-4649-a948-640d25df34b1" (UID: "acbb1675-5ef4-4649-a948-640d25df34b1"). InnerVolumeSpecName "kube-api-access-pbcvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:30:03 crc kubenswrapper[4964]: I1004 03:30:03.207532 4964 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acbb1675-5ef4-4649-a948-640d25df34b1-config-volume\") on node \"crc\" DevicePath \"\"" Oct 04 03:30:03 crc kubenswrapper[4964]: I1004 03:30:03.207576 4964 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acbb1675-5ef4-4649-a948-640d25df34b1-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 04 03:30:03 crc kubenswrapper[4964]: I1004 03:30:03.207591 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbcvw\" (UniqueName: \"kubernetes.io/projected/acbb1675-5ef4-4649-a948-640d25df34b1-kube-api-access-pbcvw\") on node \"crc\" DevicePath \"\"" Oct 04 03:30:03 crc kubenswrapper[4964]: I1004 03:30:03.522264 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" 
event={"ID":"acbb1675-5ef4-4649-a948-640d25df34b1","Type":"ContainerDied","Data":"44a94d7ec35027b83e511734225c53ab9413562833af194fa132ffd5f7e18739"} Oct 04 03:30:03 crc kubenswrapper[4964]: I1004 03:30:03.522710 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44a94d7ec35027b83e511734225c53ab9413562833af194fa132ffd5f7e18739" Oct 04 03:30:03 crc kubenswrapper[4964]: I1004 03:30:03.522309 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325810-p7mrl" Oct 04 03:30:04 crc kubenswrapper[4964]: I1004 03:30:04.094341 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67"] Oct 04 03:30:04 crc kubenswrapper[4964]: I1004 03:30:04.104178 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325765-t4h67"] Oct 04 03:30:04 crc kubenswrapper[4964]: I1004 03:30:04.859178 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c9243f1-7ba9-4c8d-bcc8-f90c3104852b" path="/var/lib/kubelet/pods/5c9243f1-7ba9-4c8d-bcc8-f90c3104852b/volumes" Oct 04 03:30:09 crc kubenswrapper[4964]: I1004 03:30:09.845484 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:30:09 crc kubenswrapper[4964]: E1004 03:30:09.846587 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:30:21 crc kubenswrapper[4964]: I1004 03:30:21.845938 4964 scope.go:117] "RemoveContainer" 
containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:30:21 crc kubenswrapper[4964]: E1004 03:30:21.846761 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:30:29 crc kubenswrapper[4964]: I1004 03:30:29.354789 4964 scope.go:117] "RemoveContainer" containerID="385ada5cd8db23138e0663e82a5fdadb43dff2f7b9bc0d0bc4106a7cda2d7ceb" Oct 04 03:30:32 crc kubenswrapper[4964]: I1004 03:30:32.845828 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:30:32 crc kubenswrapper[4964]: E1004 03:30:32.846574 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:30:43 crc kubenswrapper[4964]: I1004 03:30:43.845422 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:30:43 crc kubenswrapper[4964]: E1004 03:30:43.846204 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:30:55 crc kubenswrapper[4964]: I1004 03:30:55.846211 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:30:55 crc kubenswrapper[4964]: E1004 03:30:55.847319 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:30:57 crc kubenswrapper[4964]: I1004 03:30:57.105145 4964 generic.go:334] "Generic (PLEG): container finished" podID="2882ad3d-53fb-4ccf-aa3b-fe34165726a4" containerID="119e83a371ce7a08b731b8ed29f415a089939d53119cbd8db8a6482bfc026594" exitCode=0 Oct 04 03:30:57 crc kubenswrapper[4964]: I1004 03:30:57.105279 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" event={"ID":"2882ad3d-53fb-4ccf-aa3b-fe34165726a4","Type":"ContainerDied","Data":"119e83a371ce7a08b731b8ed29f415a089939d53119cbd8db8a6482bfc026594"} Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.589287 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.707000 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ceph-nova-0\") pod \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.707303 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ssh-key\") pod \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.707333 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-migration-ssh-key-0\") pod \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.707412 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-custom-ceph-combined-ca-bundle\") pod \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.707473 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-cell1-compute-config-0\") pod \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.707526 4964 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-cell1-compute-config-1\") pod \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.707552 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qtpb\" (UniqueName: \"kubernetes.io/projected/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-kube-api-access-4qtpb\") pod \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.707598 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-inventory\") pod \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.707695 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-migration-ssh-key-1\") pod \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.707722 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-extra-config-0\") pod \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.707743 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ceph\") pod 
\"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\" (UID: \"2882ad3d-53fb-4ccf-aa3b-fe34165726a4\") " Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.714153 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ceph" (OuterVolumeSpecName: "ceph") pod "2882ad3d-53fb-4ccf-aa3b-fe34165726a4" (UID: "2882ad3d-53fb-4ccf-aa3b-fe34165726a4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.715064 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "2882ad3d-53fb-4ccf-aa3b-fe34165726a4" (UID: "2882ad3d-53fb-4ccf-aa3b-fe34165726a4"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.716927 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-kube-api-access-4qtpb" (OuterVolumeSpecName: "kube-api-access-4qtpb") pod "2882ad3d-53fb-4ccf-aa3b-fe34165726a4" (UID: "2882ad3d-53fb-4ccf-aa3b-fe34165726a4"). InnerVolumeSpecName "kube-api-access-4qtpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.738034 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "2882ad3d-53fb-4ccf-aa3b-fe34165726a4" (UID: "2882ad3d-53fb-4ccf-aa3b-fe34165726a4"). InnerVolumeSpecName "ceph-nova-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.740586 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "2882ad3d-53fb-4ccf-aa3b-fe34165726a4" (UID: "2882ad3d-53fb-4ccf-aa3b-fe34165726a4"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.741335 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-inventory" (OuterVolumeSpecName: "inventory") pod "2882ad3d-53fb-4ccf-aa3b-fe34165726a4" (UID: "2882ad3d-53fb-4ccf-aa3b-fe34165726a4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.753457 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "2882ad3d-53fb-4ccf-aa3b-fe34165726a4" (UID: "2882ad3d-53fb-4ccf-aa3b-fe34165726a4"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.760364 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "2882ad3d-53fb-4ccf-aa3b-fe34165726a4" (UID: "2882ad3d-53fb-4ccf-aa3b-fe34165726a4"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.760522 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2882ad3d-53fb-4ccf-aa3b-fe34165726a4" (UID: "2882ad3d-53fb-4ccf-aa3b-fe34165726a4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.770978 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "2882ad3d-53fb-4ccf-aa3b-fe34165726a4" (UID: "2882ad3d-53fb-4ccf-aa3b-fe34165726a4"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.779053 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "2882ad3d-53fb-4ccf-aa3b-fe34165726a4" (UID: "2882ad3d-53fb-4ccf-aa3b-fe34165726a4"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.811608 4964 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.811685 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.811709 4964 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.811730 4964 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.811752 4964 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.811772 4964 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.811790 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qtpb\" (UniqueName: \"kubernetes.io/projected/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-kube-api-access-4qtpb\") on node 
\"crc\" DevicePath \"\"" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.811808 4964 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.811826 4964 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.811844 4964 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 04 03:30:58 crc kubenswrapper[4964]: I1004 03:30:58.811863 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2882ad3d-53fb-4ccf-aa3b-fe34165726a4-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:30:59 crc kubenswrapper[4964]: I1004 03:30:59.130513 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" event={"ID":"2882ad3d-53fb-4ccf-aa3b-fe34165726a4","Type":"ContainerDied","Data":"36b9844b0f21e4e3dec9da6b8e6a6ad508eb480726d2bead0966940f4493fd8a"} Oct 04 03:30:59 crc kubenswrapper[4964]: I1004 03:30:59.130568 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36b9844b0f21e4e3dec9da6b8e6a6ad508eb480726d2bead0966940f4493fd8a" Oct 04 03:30:59 crc kubenswrapper[4964]: I1004 03:30:59.130653 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds" Oct 04 03:31:07 crc kubenswrapper[4964]: I1004 03:31:07.845403 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:31:07 crc kubenswrapper[4964]: E1004 03:31:07.846728 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.676180 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 04 03:31:12 crc kubenswrapper[4964]: E1004 03:31:12.676922 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2882ad3d-53fb-4ccf-aa3b-fe34165726a4" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.676938 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="2882ad3d-53fb-4ccf-aa3b-fe34165726a4" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 04 03:31:12 crc kubenswrapper[4964]: E1004 03:31:12.676958 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acbb1675-5ef4-4649-a948-640d25df34b1" containerName="collect-profiles" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.676964 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbb1675-5ef4-4649-a948-640d25df34b1" containerName="collect-profiles" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.677115 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="acbb1675-5ef4-4649-a948-640d25df34b1" 
containerName="collect-profiles" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.677128 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="2882ad3d-53fb-4ccf-aa3b-fe34165726a4" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.678264 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.680306 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.680469 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.711795 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.726837 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.728600 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.733843 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.757071 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800317 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-config-data\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800351 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800388 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800420 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800438 4964 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1eb33c04-b905-4472-839d-89537682be92-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800455 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-ceph\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800472 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800487 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-run\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800504 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800525 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800539 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800556 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-config-data-custom\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800570 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-run\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800597 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb33c04-b905-4472-839d-89537682be92-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800624 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlgpx\" (UniqueName: 
\"kubernetes.io/projected/1eb33c04-b905-4472-839d-89537682be92-kube-api-access-mlgpx\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800638 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-scripts\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800654 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-lib-modules\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800668 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-sys\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800684 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800713 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-etc-nvme\") pod \"cinder-volume-volume1-0\" 
(UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800728 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-etc-nvme\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800751 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb33c04-b905-4472-839d-89537682be92-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800934 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800951 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800964 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-dev\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " 
pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800979 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb33c04-b905-4472-839d-89537682be92-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.800996 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.801010 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.801027 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwmkz\" (UniqueName: \"kubernetes.io/projected/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-kube-api-access-wwmkz\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.801058 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-dev\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 
03:31:12.801077 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eb33c04-b905-4472-839d-89537682be92-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.801096 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-sys\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903189 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903231 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-config-data-custom\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903254 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903280 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-run\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903337 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-scripts\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903357 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb33c04-b905-4472-839d-89537682be92-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903378 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlgpx\" (UniqueName: \"kubernetes.io/projected/1eb33c04-b905-4472-839d-89537682be92-kube-api-access-mlgpx\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903401 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-lib-modules\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903424 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-sys\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 
03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903443 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903470 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903482 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903518 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-etc-nvme\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903550 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb33c04-b905-4472-839d-89537682be92-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903593 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" 
(UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903632 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903653 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-dev\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903673 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb33c04-b905-4472-839d-89537682be92-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903699 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903721 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " 
pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903757 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwmkz\" (UniqueName: \"kubernetes.io/projected/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-kube-api-access-wwmkz\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903782 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-dev\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903808 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eb33c04-b905-4472-839d-89537682be92-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903832 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-sys\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903863 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-config-data\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903886 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903939 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.903993 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.904016 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1eb33c04-b905-4472-839d-89537682be92-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.904037 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.904056 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-ceph\") pod \"cinder-backup-0\" (UID: 
\"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.904077 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-run\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.904112 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.904204 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.904240 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-dev\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.904738 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.904770 4964 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.904793 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-lib-modules\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.904819 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-sys\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.904827 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-dev\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.904841 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.904880 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 
crc kubenswrapper[4964]: I1004 03:31:12.906117 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-etc-nvme\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.906203 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.907233 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-run\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.907280 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.907430 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-sys\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.907970 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-var-lib-cinder\") pod 
\"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.907974 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.908013 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-run\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.908048 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1eb33c04-b905-4472-839d-89537682be92-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.908075 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.909543 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb33c04-b905-4472-839d-89537682be92-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.909942 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.911567 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb33c04-b905-4472-839d-89537682be92-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.914990 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-config-data\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.915646 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-ceph\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.917385 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-config-data-custom\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.919383 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-scripts\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " 
pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.921508 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb33c04-b905-4472-839d-89537682be92-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.923176 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1eb33c04-b905-4472-839d-89537682be92-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.924253 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlgpx\" (UniqueName: \"kubernetes.io/projected/1eb33c04-b905-4472-839d-89537682be92-kube-api-access-mlgpx\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.925461 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwmkz\" (UniqueName: \"kubernetes.io/projected/68b726b8-57ae-48e1-ba37-9e0be7cc3f79-kube-api-access-wwmkz\") pod \"cinder-backup-0\" (UID: \"68b726b8-57ae-48e1-ba37-9e0be7cc3f79\") " pod="openstack/cinder-backup-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.941228 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eb33c04-b905-4472-839d-89537682be92-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"1eb33c04-b905-4472-839d-89537682be92\") " pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:12 crc kubenswrapper[4964]: I1004 03:31:12.994661 4964 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.049840 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.259975 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-snvm2"] Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.261949 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-snvm2" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.284529 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-snvm2"] Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.325356 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22fbc\" (UniqueName: \"kubernetes.io/projected/4b92794e-7e78-4c34-8967-4c7354aa9df2-kube-api-access-22fbc\") pod \"manila-db-create-snvm2\" (UID: \"4b92794e-7e78-4c34-8967-4c7354aa9df2\") " pod="openstack/manila-db-create-snvm2" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.393468 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6554d5fc67-d5vgm"] Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.395575 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.403856 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.404003 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-sr77l" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.404118 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.404247 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.409939 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6554d5fc67-d5vgm"] Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.427225 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22fbc\" (UniqueName: \"kubernetes.io/projected/4b92794e-7e78-4c34-8967-4c7354aa9df2-kube-api-access-22fbc\") pod \"manila-db-create-snvm2\" (UID: \"4b92794e-7e78-4c34-8967-4c7354aa9df2\") " pod="openstack/manila-db-create-snvm2" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.465011 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22fbc\" (UniqueName: \"kubernetes.io/projected/4b92794e-7e78-4c34-8967-4c7354aa9df2-kube-api-access-22fbc\") pod \"manila-db-create-snvm2\" (UID: \"4b92794e-7e78-4c34-8967-4c7354aa9df2\") " pod="openstack/manila-db-create-snvm2" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.538689 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-66574674fc-j28wh"] Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.560811 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66574674fc-j28wh"] Oct 04 03:31:13 crc 
kubenswrapper[4964]: I1004 03:31:13.561015 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.576202 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-logs\") pod \"horizon-6554d5fc67-d5vgm\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.576359 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-scripts\") pod \"horizon-6554d5fc67-d5vgm\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.576388 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcmbz\" (UniqueName: \"kubernetes.io/projected/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-kube-api-access-fcmbz\") pod \"horizon-6554d5fc67-d5vgm\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.576410 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-horizon-secret-key\") pod \"horizon-6554d5fc67-d5vgm\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.576483 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-config-data\") pod \"horizon-6554d5fc67-d5vgm\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.612413 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.612980 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-snvm2" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.650288 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.660032 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.660236 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.660389 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.660555 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rkxfr" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.679701 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.680724 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5445c577-835c-432b-88da-5fe7a9107cac-horizon-secret-key\") pod \"horizon-66574674fc-j28wh\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " pod="openstack/horizon-66574674fc-j28wh" Oct 04 
03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.680787 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-config-data\") pod \"horizon-6554d5fc67-d5vgm\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.680816 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l846f\" (UniqueName: \"kubernetes.io/projected/5445c577-835c-432b-88da-5fe7a9107cac-kube-api-access-l846f\") pod \"horizon-66574674fc-j28wh\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.680844 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-logs\") pod \"horizon-6554d5fc67-d5vgm\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.680914 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5445c577-835c-432b-88da-5fe7a9107cac-config-data\") pod \"horizon-66574674fc-j28wh\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.680949 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5445c577-835c-432b-88da-5fe7a9107cac-scripts\") pod \"horizon-66574674fc-j28wh\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.680985 4964 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5445c577-835c-432b-88da-5fe7a9107cac-logs\") pod \"horizon-66574674fc-j28wh\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.681002 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-scripts\") pod \"horizon-6554d5fc67-d5vgm\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.681020 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcmbz\" (UniqueName: \"kubernetes.io/projected/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-kube-api-access-fcmbz\") pod \"horizon-6554d5fc67-d5vgm\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.681037 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-horizon-secret-key\") pod \"horizon-6554d5fc67-d5vgm\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.692623 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-config-data\") pod \"horizon-6554d5fc67-d5vgm\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.692868 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-logs\") pod \"horizon-6554d5fc67-d5vgm\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.693468 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-scripts\") pod \"horizon-6554d5fc67-d5vgm\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.697832 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-horizon-secret-key\") pod \"horizon-6554d5fc67-d5vgm\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.762262 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcmbz\" (UniqueName: \"kubernetes.io/projected/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-kube-api-access-fcmbz\") pod \"horizon-6554d5fc67-d5vgm\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.762331 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.763940 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.775066 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.775283 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.780299 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.782188 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l846f\" (UniqueName: \"kubernetes.io/projected/5445c577-835c-432b-88da-5fe7a9107cac-kube-api-access-l846f\") pod \"horizon-66574674fc-j28wh\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.782227 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-config-data\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.782287 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfde5846-8a91-45d4-91f9-1c0ae1d16442-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.782315 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgtjc\" (UniqueName: 
\"kubernetes.io/projected/dfde5846-8a91-45d4-91f9-1c0ae1d16442-kube-api-access-rgtjc\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.782333 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.782355 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5445c577-835c-432b-88da-5fe7a9107cac-config-data\") pod \"horizon-66574674fc-j28wh\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.782389 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5445c577-835c-432b-88da-5fe7a9107cac-scripts\") pod \"horizon-66574674fc-j28wh\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.782412 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.782434 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.782572 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5445c577-835c-432b-88da-5fe7a9107cac-logs\") pod \"horizon-66574674fc-j28wh\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.782603 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dfde5846-8a91-45d4-91f9-1c0ae1d16442-ceph\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.782668 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5445c577-835c-432b-88da-5fe7a9107cac-horizon-secret-key\") pod \"horizon-66574674fc-j28wh\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.782705 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfde5846-8a91-45d4-91f9-1c0ae1d16442-logs\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.782725 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-scripts\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.784219 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5445c577-835c-432b-88da-5fe7a9107cac-config-data\") pod \"horizon-66574674fc-j28wh\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.784441 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5445c577-835c-432b-88da-5fe7a9107cac-logs\") pod \"horizon-66574674fc-j28wh\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.785040 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5445c577-835c-432b-88da-5fe7a9107cac-scripts\") pod \"horizon-66574674fc-j28wh\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.786827 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5445c577-835c-432b-88da-5fe7a9107cac-horizon-secret-key\") pod \"horizon-66574674fc-j28wh\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.804820 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l846f\" (UniqueName: \"kubernetes.io/projected/5445c577-835c-432b-88da-5fe7a9107cac-kube-api-access-l846f\") pod \"horizon-66574674fc-j28wh\" (UID: 
\"5445c577-835c-432b-88da-5fe7a9107cac\") " pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: W1004 03:31:13.851926 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1eb33c04_b905_4472_839d_89537682be92.slice/crio-c254afd0a1c400c24fb5530b262d8eb3c8805f069df9ec06dc5d241d88c68dfd WatchSource:0}: Error finding container c254afd0a1c400c24fb5530b262d8eb3c8805f069df9ec06dc5d241d88c68dfd: Status 404 returned error can't find the container with id c254afd0a1c400c24fb5530b262d8eb3c8805f069df9ec06dc5d241d88c68dfd Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.853318 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.885520 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8dacfd57-acf5-4306-9488-f1130ccc0689-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.885566 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-config-data\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.885585 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc 
kubenswrapper[4964]: I1004 03:31:13.885820 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dacfd57-acf5-4306-9488-f1130ccc0689-logs\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.885899 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz4h9\" (UniqueName: \"kubernetes.io/projected/8dacfd57-acf5-4306-9488-f1130ccc0689-kube-api-access-gz4h9\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.886562 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfde5846-8a91-45d4-91f9-1c0ae1d16442-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.886655 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgtjc\" (UniqueName: \"kubernetes.io/projected/dfde5846-8a91-45d4-91f9-1c0ae1d16442-kube-api-access-rgtjc\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.886688 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc 
kubenswrapper[4964]: I1004 03:31:13.886740 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.886834 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.886866 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.886952 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfde5846-8a91-45d4-91f9-1c0ae1d16442-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.886901 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.887246 4964 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.889138 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.889235 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dfde5846-8a91-45d4-91f9-1c0ae1d16442-ceph\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.889359 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.889469 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfde5846-8a91-45d4-91f9-1c0ae1d16442-logs\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.889496 4964 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-scripts\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.889557 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dacfd57-acf5-4306-9488-f1130ccc0689-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.890137 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfde5846-8a91-45d4-91f9-1c0ae1d16442-logs\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.890943 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.894437 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.900040 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-scripts\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.901652 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-config-data\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.904746 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgtjc\" (UniqueName: \"kubernetes.io/projected/dfde5846-8a91-45d4-91f9-1c0ae1d16442-kube-api-access-rgtjc\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.915968 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.916488 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dfde5846-8a91-45d4-91f9-1c0ae1d16442-ceph\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.935854 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.991207 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.991284 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dacfd57-acf5-4306-9488-f1130ccc0689-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.991303 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8dacfd57-acf5-4306-9488-f1130ccc0689-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.991321 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.991344 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dacfd57-acf5-4306-9488-f1130ccc0689-logs\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.991368 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz4h9\" (UniqueName: \"kubernetes.io/projected/8dacfd57-acf5-4306-9488-f1130ccc0689-kube-api-access-gz4h9\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.991435 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.991474 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.991497 4964 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.991665 4964 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.991750 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dacfd57-acf5-4306-9488-f1130ccc0689-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.993669 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dacfd57-acf5-4306-9488-f1130ccc0689-logs\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:13 crc kubenswrapper[4964]: I1004 03:31:13.997912 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:14 crc kubenswrapper[4964]: I1004 03:31:14.001393 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:14 crc kubenswrapper[4964]: I1004 03:31:14.001742 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:14 crc kubenswrapper[4964]: I1004 03:31:14.001836 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8dacfd57-acf5-4306-9488-f1130ccc0689-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:14 crc kubenswrapper[4964]: I1004 03:31:14.002650 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:14 crc kubenswrapper[4964]: I1004 03:31:14.017813 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz4h9\" (UniqueName: \"kubernetes.io/projected/8dacfd57-acf5-4306-9488-f1130ccc0689-kube-api-access-gz4h9\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:14 crc kubenswrapper[4964]: I1004 03:31:14.022496 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:14 crc kubenswrapper[4964]: I1004 03:31:14.034163 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:14 crc kubenswrapper[4964]: I1004 03:31:14.038733 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 03:31:14 crc kubenswrapper[4964]: I1004 03:31:14.101265 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:14 crc kubenswrapper[4964]: W1004 03:31:14.205211 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b92794e_7e78_4c34_8967_4c7354aa9df2.slice/crio-0f08b7f5c395470df6acfd91616bfa521e993e3e0b6315b530862b9a764c4c6c WatchSource:0}: Error finding container 0f08b7f5c395470df6acfd91616bfa521e993e3e0b6315b530862b9a764c4c6c: Status 404 returned error can't find the container with id 0f08b7f5c395470df6acfd91616bfa521e993e3e0b6315b530862b9a764c4c6c Oct 04 03:31:14 crc kubenswrapper[4964]: I1004 03:31:14.210427 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-snvm2"] Oct 04 03:31:14 crc kubenswrapper[4964]: I1004 03:31:14.345333 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"1eb33c04-b905-4472-839d-89537682be92","Type":"ContainerStarted","Data":"c254afd0a1c400c24fb5530b262d8eb3c8805f069df9ec06dc5d241d88c68dfd"} Oct 04 03:31:14 crc kubenswrapper[4964]: I1004 03:31:14.347077 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-snvm2" 
event={"ID":"4b92794e-7e78-4c34-8967-4c7354aa9df2","Type":"ContainerStarted","Data":"0f08b7f5c395470df6acfd91616bfa521e993e3e0b6315b530862b9a764c4c6c"} Oct 04 03:31:14 crc kubenswrapper[4964]: I1004 03:31:14.352000 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66574674fc-j28wh"] Oct 04 03:31:14 crc kubenswrapper[4964]: W1004 03:31:14.362144 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5445c577_835c_432b_88da_5fe7a9107cac.slice/crio-0aa030399cef43c850f53d761fe47758ecf88c1a4258fd81a4d164d9e8150443 WatchSource:0}: Error finding container 0aa030399cef43c850f53d761fe47758ecf88c1a4258fd81a4d164d9e8150443: Status 404 returned error can't find the container with id 0aa030399cef43c850f53d761fe47758ecf88c1a4258fd81a4d164d9e8150443 Oct 04 03:31:14 crc kubenswrapper[4964]: I1004 03:31:14.562553 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6554d5fc67-d5vgm"] Oct 04 03:31:14 crc kubenswrapper[4964]: I1004 03:31:14.699085 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 03:31:14 crc kubenswrapper[4964]: W1004 03:31:14.709029 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfde5846_8a91_45d4_91f9_1c0ae1d16442.slice/crio-7fbfc58c6cb92ea34c6dc7cdb709957b7e83f8b27cafb71dfc1963060dbf47ae WatchSource:0}: Error finding container 7fbfc58c6cb92ea34c6dc7cdb709957b7e83f8b27cafb71dfc1963060dbf47ae: Status 404 returned error can't find the container with id 7fbfc58c6cb92ea34c6dc7cdb709957b7e83f8b27cafb71dfc1963060dbf47ae Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.352012 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lzczl"] Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.355099 4964 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.359683 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"1eb33c04-b905-4472-839d-89537682be92","Type":"ContainerStarted","Data":"384e91ca3ec27e7bb84ce3903869a24988e506f00709469a0ea952ca8360e2e2"} Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.359728 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"1eb33c04-b905-4472-839d-89537682be92","Type":"ContainerStarted","Data":"18aaf2416743b1cef22ca528326ba7d6545ebf2d9726bb6387184975969dea6f"} Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.361330 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lzczl"] Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.366420 4964 generic.go:334] "Generic (PLEG): container finished" podID="4b92794e-7e78-4c34-8967-4c7354aa9df2" containerID="4a43c8a8c9c57712d385a29d486c1d47a82d1d2a57dfe5b518b1652837361e58" exitCode=0 Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.366482 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-snvm2" event={"ID":"4b92794e-7e78-4c34-8967-4c7354aa9df2","Type":"ContainerDied","Data":"4a43c8a8c9c57712d385a29d486c1d47a82d1d2a57dfe5b518b1652837361e58"} Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.368958 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6554d5fc67-d5vgm" event={"ID":"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d","Type":"ContainerStarted","Data":"1ab3f4210a94abe3baa23dd8411c0c61b5dfd3319dad37c40905fd95336b3956"} Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.375359 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"dfde5846-8a91-45d4-91f9-1c0ae1d16442","Type":"ContainerStarted","Data":"944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b"} Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.375402 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dfde5846-8a91-45d4-91f9-1c0ae1d16442","Type":"ContainerStarted","Data":"7fbfc58c6cb92ea34c6dc7cdb709957b7e83f8b27cafb71dfc1963060dbf47ae"} Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.404118 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66574674fc-j28wh" event={"ID":"5445c577-835c-432b-88da-5fe7a9107cac","Type":"ContainerStarted","Data":"0aa030399cef43c850f53d761fe47758ecf88c1a4258fd81a4d164d9e8150443"} Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.421237 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.660996217 podStartE2EDuration="3.421219951s" podCreationTimestamp="2025-10-04 03:31:12 +0000 UTC" firstStartedPulling="2025-10-04 03:31:13.853227979 +0000 UTC m=+3053.750186617" lastFinishedPulling="2025-10-04 03:31:14.613451713 +0000 UTC m=+3054.510410351" observedRunningTime="2025-10-04 03:31:15.420737758 +0000 UTC m=+3055.317696406" watchObservedRunningTime="2025-10-04 03:31:15.421219951 +0000 UTC m=+3055.318178589" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.431034 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfhfc\" (UniqueName: \"kubernetes.io/projected/944df5c7-1a65-4893-bc06-306422fcb360-kube-api-access-tfhfc\") pod \"certified-operators-lzczl\" (UID: \"944df5c7-1a65-4893-bc06-306422fcb360\") " pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.431120 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944df5c7-1a65-4893-bc06-306422fcb360-catalog-content\") pod \"certified-operators-lzczl\" (UID: \"944df5c7-1a65-4893-bc06-306422fcb360\") " pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.431155 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944df5c7-1a65-4893-bc06-306422fcb360-utilities\") pod \"certified-operators-lzczl\" (UID: \"944df5c7-1a65-4893-bc06-306422fcb360\") " pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.465484 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.533233 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfhfc\" (UniqueName: \"kubernetes.io/projected/944df5c7-1a65-4893-bc06-306422fcb360-kube-api-access-tfhfc\") pod \"certified-operators-lzczl\" (UID: \"944df5c7-1a65-4893-bc06-306422fcb360\") " pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.551760 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944df5c7-1a65-4893-bc06-306422fcb360-catalog-content\") pod \"certified-operators-lzczl\" (UID: \"944df5c7-1a65-4893-bc06-306422fcb360\") " pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.551893 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944df5c7-1a65-4893-bc06-306422fcb360-utilities\") pod \"certified-operators-lzczl\" (UID: \"944df5c7-1a65-4893-bc06-306422fcb360\") " 
pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.552700 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944df5c7-1a65-4893-bc06-306422fcb360-utilities\") pod \"certified-operators-lzczl\" (UID: \"944df5c7-1a65-4893-bc06-306422fcb360\") " pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.552959 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944df5c7-1a65-4893-bc06-306422fcb360-catalog-content\") pod \"certified-operators-lzczl\" (UID: \"944df5c7-1a65-4893-bc06-306422fcb360\") " pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.554168 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfhfc\" (UniqueName: \"kubernetes.io/projected/944df5c7-1a65-4893-bc06-306422fcb360-kube-api-access-tfhfc\") pod \"certified-operators-lzczl\" (UID: \"944df5c7-1a65-4893-bc06-306422fcb360\") " pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.786017 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6554d5fc67-d5vgm"] Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.801279 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.821335 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.828137 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67d99bc788-zm78q"] Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.829702 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.837346 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.858794 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jffnt\" (UniqueName: \"kubernetes.io/projected/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-kube-api-access-jffnt\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.858855 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-combined-ca-bundle\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.858903 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-logs\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.858948 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-horizon-tls-certs\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.858984 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-horizon-secret-key\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.859004 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-config-data\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.859020 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-scripts\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.860195 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67d99bc788-zm78q"] Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.871873 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.923300 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66574674fc-j28wh"] Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.940048 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-64f9d99668-zvfzz"] Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.941491 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.951001 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64f9d99668-zvfzz"] Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.971673 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-logs\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.971750 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-horizon-tls-certs\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.971792 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-horizon-secret-key\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.971813 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-config-data\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.971828 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-scripts\") pod 
\"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.971902 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jffnt\" (UniqueName: \"kubernetes.io/projected/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-kube-api-access-jffnt\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.971947 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-combined-ca-bundle\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.978913 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-scripts\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.979783 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-config-data\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc kubenswrapper[4964]: I1004 03:31:15.979990 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-logs\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:15 crc 
kubenswrapper[4964]: I1004 03:31:15.982257 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-horizon-secret-key\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.000290 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-combined-ca-bundle\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.002127 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-horizon-tls-certs\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.017582 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jffnt\" (UniqueName: \"kubernetes.io/projected/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-kube-api-access-jffnt\") pod \"horizon-67d99bc788-zm78q\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.061055 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.089655 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-combined-ca-bundle\") pod \"horizon-64f9d99668-zvfzz\" (UID: 
\"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.090077 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-horizon-tls-certs\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.090185 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-scripts\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.090321 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-config-data\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.090390 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvv5h\" (UniqueName: \"kubernetes.io/projected/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-kube-api-access-xvv5h\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.090463 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-horizon-secret-key\") pod \"horizon-64f9d99668-zvfzz\" 
(UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.090516 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-logs\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.161609 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.214961 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-horizon-tls-certs\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.215027 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-scripts\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.215214 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-config-data\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.215284 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvv5h\" (UniqueName: 
\"kubernetes.io/projected/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-kube-api-access-xvv5h\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.215317 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-horizon-secret-key\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.215343 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-logs\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.215418 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-combined-ca-bundle\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.217083 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-scripts\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.218114 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-config-data\") pod \"horizon-64f9d99668-zvfzz\" (UID: 
\"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.219968 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-combined-ca-bundle\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.220893 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-logs\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.223013 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-horizon-tls-certs\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.230042 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-horizon-secret-key\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.244812 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvv5h\" (UniqueName: \"kubernetes.io/projected/39f65132-4f7b-4c79-ba9b-e86c15ec60d6-kube-api-access-xvv5h\") pod \"horizon-64f9d99668-zvfzz\" (UID: \"39f65132-4f7b-4c79-ba9b-e86c15ec60d6\") " pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: 
I1004 03:31:16.390945 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.415992 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"68b726b8-57ae-48e1-ba37-9e0be7cc3f79","Type":"ContainerStarted","Data":"823ee8806f5a87f9e5ab14c94b85cd7034704a8f16395abb9c1eccfacf3cfa96"} Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.418538 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8dacfd57-acf5-4306-9488-f1130ccc0689","Type":"ContainerStarted","Data":"04e8055be55f4242f51dd993adfe0614c0da70602fd25557f6cfb2235a48817f"} Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.605543 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lzczl"] Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.720551 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67d99bc788-zm78q"] Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.854438 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-snvm2" Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.932289 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22fbc\" (UniqueName: \"kubernetes.io/projected/4b92794e-7e78-4c34-8967-4c7354aa9df2-kube-api-access-22fbc\") pod \"4b92794e-7e78-4c34-8967-4c7354aa9df2\" (UID: \"4b92794e-7e78-4c34-8967-4c7354aa9df2\") " Oct 04 03:31:16 crc kubenswrapper[4964]: I1004 03:31:16.938668 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b92794e-7e78-4c34-8967-4c7354aa9df2-kube-api-access-22fbc" (OuterVolumeSpecName: "kube-api-access-22fbc") pod "4b92794e-7e78-4c34-8967-4c7354aa9df2" (UID: "4b92794e-7e78-4c34-8967-4c7354aa9df2"). InnerVolumeSpecName "kube-api-access-22fbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.024998 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64f9d99668-zvfzz"] Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.035101 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22fbc\" (UniqueName: \"kubernetes.io/projected/4b92794e-7e78-4c34-8967-4c7354aa9df2-kube-api-access-22fbc\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.433992 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64f9d99668-zvfzz" event={"ID":"39f65132-4f7b-4c79-ba9b-e86c15ec60d6","Type":"ContainerStarted","Data":"bcdc85836f15d5dea6b99889a5a9930cd4413a70d22e811ca5ed2fa692efcf56"} Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.437343 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"68b726b8-57ae-48e1-ba37-9e0be7cc3f79","Type":"ContainerStarted","Data":"ff0abc7685850f02d3136470a5680d92436705f317e3633d388960a86f992454"} Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 
03:31:17.438558 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67d99bc788-zm78q" event={"ID":"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc","Type":"ContainerStarted","Data":"c5f02cb8cd8dfe4eba9e8208ea2d34143ecd582315073ad90955b8be4817ffca"} Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.440230 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-snvm2" event={"ID":"4b92794e-7e78-4c34-8967-4c7354aa9df2","Type":"ContainerDied","Data":"0f08b7f5c395470df6acfd91616bfa521e993e3e0b6315b530862b9a764c4c6c"} Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.440303 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f08b7f5c395470df6acfd91616bfa521e993e3e0b6315b530862b9a764c4c6c" Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.440260 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-snvm2" Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.455806 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8dacfd57-acf5-4306-9488-f1130ccc0689","Type":"ContainerStarted","Data":"ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f"} Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.466734 4964 generic.go:334] "Generic (PLEG): container finished" podID="944df5c7-1a65-4893-bc06-306422fcb360" containerID="61fb44aa24560f419384d8cc3d72d6dd6ab1d86780e207e42935c04c7fe8df2b" exitCode=0 Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.466983 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzczl" event={"ID":"944df5c7-1a65-4893-bc06-306422fcb360","Type":"ContainerDied","Data":"61fb44aa24560f419384d8cc3d72d6dd6ab1d86780e207e42935c04c7fe8df2b"} Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.467024 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-lzczl" event={"ID":"944df5c7-1a65-4893-bc06-306422fcb360","Type":"ContainerStarted","Data":"a5092033c369419be2fef1c364bad892be95c682e327340e530edb4d078afeec"} Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.474787 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dfde5846-8a91-45d4-91f9-1c0ae1d16442","Type":"ContainerStarted","Data":"a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4"} Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.474900 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="dfde5846-8a91-45d4-91f9-1c0ae1d16442" containerName="glance-log" containerID="cri-o://944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b" gracePeriod=30 Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.475108 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="dfde5846-8a91-45d4-91f9-1c0ae1d16442" containerName="glance-httpd" containerID="cri-o://a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4" gracePeriod=30 Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.516677 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.516662709 podStartE2EDuration="4.516662709s" podCreationTimestamp="2025-10-04 03:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 03:31:17.511648846 +0000 UTC m=+3057.408607474" watchObservedRunningTime="2025-10-04 03:31:17.516662709 +0000 UTC m=+3057.413621347" Oct 04 03:31:17 crc kubenswrapper[4964]: I1004 03:31:17.995148 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 04 
03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.150367 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.256557 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-config-data\") pod \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.256653 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-combined-ca-bundle\") pod \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.256694 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfde5846-8a91-45d4-91f9-1c0ae1d16442-logs\") pod \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.257456 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-scripts\") pod \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.257755 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfde5846-8a91-45d4-91f9-1c0ae1d16442-httpd-run\") pod \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.257837 4964 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.257871 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgtjc\" (UniqueName: \"kubernetes.io/projected/dfde5846-8a91-45d4-91f9-1c0ae1d16442-kube-api-access-rgtjc\") pod \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.257975 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-public-tls-certs\") pod \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.258082 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dfde5846-8a91-45d4-91f9-1c0ae1d16442-ceph\") pod \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\" (UID: \"dfde5846-8a91-45d4-91f9-1c0ae1d16442\") " Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.258279 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfde5846-8a91-45d4-91f9-1c0ae1d16442-logs" (OuterVolumeSpecName: "logs") pod "dfde5846-8a91-45d4-91f9-1c0ae1d16442" (UID: "dfde5846-8a91-45d4-91f9-1c0ae1d16442"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.258310 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfde5846-8a91-45d4-91f9-1c0ae1d16442-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dfde5846-8a91-45d4-91f9-1c0ae1d16442" (UID: "dfde5846-8a91-45d4-91f9-1c0ae1d16442"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.258959 4964 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfde5846-8a91-45d4-91f9-1c0ae1d16442-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.258987 4964 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfde5846-8a91-45d4-91f9-1c0ae1d16442-logs\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.263602 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "dfde5846-8a91-45d4-91f9-1c0ae1d16442" (UID: "dfde5846-8a91-45d4-91f9-1c0ae1d16442"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.263692 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfde5846-8a91-45d4-91f9-1c0ae1d16442-kube-api-access-rgtjc" (OuterVolumeSpecName: "kube-api-access-rgtjc") pod "dfde5846-8a91-45d4-91f9-1c0ae1d16442" (UID: "dfde5846-8a91-45d4-91f9-1c0ae1d16442"). InnerVolumeSpecName "kube-api-access-rgtjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.276861 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfde5846-8a91-45d4-91f9-1c0ae1d16442-ceph" (OuterVolumeSpecName: "ceph") pod "dfde5846-8a91-45d4-91f9-1c0ae1d16442" (UID: "dfde5846-8a91-45d4-91f9-1c0ae1d16442"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.289537 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-scripts" (OuterVolumeSpecName: "scripts") pod "dfde5846-8a91-45d4-91f9-1c0ae1d16442" (UID: "dfde5846-8a91-45d4-91f9-1c0ae1d16442"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.333731 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-config-data" (OuterVolumeSpecName: "config-data") pod "dfde5846-8a91-45d4-91f9-1c0ae1d16442" (UID: "dfde5846-8a91-45d4-91f9-1c0ae1d16442"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.342456 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfde5846-8a91-45d4-91f9-1c0ae1d16442" (UID: "dfde5846-8a91-45d4-91f9-1c0ae1d16442"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.360535 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dfde5846-8a91-45d4-91f9-1c0ae1d16442-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.360812 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.361008 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.361133 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.361255 4964 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.361342 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgtjc\" (UniqueName: \"kubernetes.io/projected/dfde5846-8a91-45d4-91f9-1c0ae1d16442-kube-api-access-rgtjc\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.374469 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dfde5846-8a91-45d4-91f9-1c0ae1d16442" (UID: "dfde5846-8a91-45d4-91f9-1c0ae1d16442"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.389280 4964 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.463045 4964 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.463075 4964 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfde5846-8a91-45d4-91f9-1c0ae1d16442-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.484228 4964 generic.go:334] "Generic (PLEG): container finished" podID="dfde5846-8a91-45d4-91f9-1c0ae1d16442" containerID="a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4" exitCode=0 Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.484256 4964 generic.go:334] "Generic (PLEG): container finished" podID="dfde5846-8a91-45d4-91f9-1c0ae1d16442" containerID="944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b" exitCode=143 Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.484279 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.484308 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dfde5846-8a91-45d4-91f9-1c0ae1d16442","Type":"ContainerDied","Data":"a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4"} Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.484337 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dfde5846-8a91-45d4-91f9-1c0ae1d16442","Type":"ContainerDied","Data":"944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b"} Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.484349 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dfde5846-8a91-45d4-91f9-1c0ae1d16442","Type":"ContainerDied","Data":"7fbfc58c6cb92ea34c6dc7cdb709957b7e83f8b27cafb71dfc1963060dbf47ae"} Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.484365 4964 scope.go:117] "RemoveContainer" containerID="a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.486585 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"68b726b8-57ae-48e1-ba37-9e0be7cc3f79","Type":"ContainerStarted","Data":"0ac5ec057869f0be3fa3ad2e6125173b81cdfd332bf416ab19e1e3e51a3ee22c"} Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.488302 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8dacfd57-acf5-4306-9488-f1130ccc0689","Type":"ContainerStarted","Data":"ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b"} Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.488432 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="8dacfd57-acf5-4306-9488-f1130ccc0689" containerName="glance-log" containerID="cri-o://ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f" gracePeriod=30 Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.488453 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8dacfd57-acf5-4306-9488-f1130ccc0689" containerName="glance-httpd" containerID="cri-o://ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b" gracePeriod=30 Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.535049 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=5.437884825 podStartE2EDuration="6.535027347s" podCreationTimestamp="2025-10-04 03:31:12 +0000 UTC" firstStartedPulling="2025-10-04 03:31:15.90142571 +0000 UTC m=+3055.798384338" lastFinishedPulling="2025-10-04 03:31:16.998568222 +0000 UTC m=+3056.895526860" observedRunningTime="2025-10-04 03:31:18.52536263 +0000 UTC m=+3058.422321278" watchObservedRunningTime="2025-10-04 03:31:18.535027347 +0000 UTC m=+3058.431985985" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.553000 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.552983104 podStartE2EDuration="5.552983104s" podCreationTimestamp="2025-10-04 03:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 03:31:18.542576608 +0000 UTC m=+3058.439535246" watchObservedRunningTime="2025-10-04 03:31:18.552983104 +0000 UTC m=+3058.449941742" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.579473 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.608720 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-external-api-0"] Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.628544 4964 scope.go:117] "RemoveContainer" containerID="944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.632244 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 03:31:18 crc kubenswrapper[4964]: E1004 03:31:18.632590 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfde5846-8a91-45d4-91f9-1c0ae1d16442" containerName="glance-log" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.632604 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfde5846-8a91-45d4-91f9-1c0ae1d16442" containerName="glance-log" Oct 04 03:31:18 crc kubenswrapper[4964]: E1004 03:31:18.632637 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b92794e-7e78-4c34-8967-4c7354aa9df2" containerName="mariadb-database-create" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.632643 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b92794e-7e78-4c34-8967-4c7354aa9df2" containerName="mariadb-database-create" Oct 04 03:31:18 crc kubenswrapper[4964]: E1004 03:31:18.632656 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfde5846-8a91-45d4-91f9-1c0ae1d16442" containerName="glance-httpd" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.632662 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfde5846-8a91-45d4-91f9-1c0ae1d16442" containerName="glance-httpd" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.632833 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfde5846-8a91-45d4-91f9-1c0ae1d16442" containerName="glance-log" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.632850 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfde5846-8a91-45d4-91f9-1c0ae1d16442" containerName="glance-httpd" Oct 04 03:31:18 crc 
kubenswrapper[4964]: I1004 03:31:18.632869 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b92794e-7e78-4c34-8967-4c7354aa9df2" containerName="mariadb-database-create" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.634217 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.638074 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.638144 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.646798 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.695417 4964 scope.go:117] "RemoveContainer" containerID="a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4" Oct 04 03:31:18 crc kubenswrapper[4964]: E1004 03:31:18.696481 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4\": container with ID starting with a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4 not found: ID does not exist" containerID="a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.696521 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4"} err="failed to get container status \"a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4\": rpc error: code = NotFound desc = could not find container 
\"a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4\": container with ID starting with a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4 not found: ID does not exist" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.696548 4964 scope.go:117] "RemoveContainer" containerID="944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b" Oct 04 03:31:18 crc kubenswrapper[4964]: E1004 03:31:18.697082 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b\": container with ID starting with 944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b not found: ID does not exist" containerID="944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.697106 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b"} err="failed to get container status \"944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b\": rpc error: code = NotFound desc = could not find container \"944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b\": container with ID starting with 944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b not found: ID does not exist" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.697126 4964 scope.go:117] "RemoveContainer" containerID="a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.698597 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4"} err="failed to get container status \"a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4\": rpc error: code = NotFound desc = could not find 
container \"a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4\": container with ID starting with a1bd81a9653d48639df544ed3c53f89e71d87e1442b5b371ee221335d5efb6d4 not found: ID does not exist" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.698664 4964 scope.go:117] "RemoveContainer" containerID="944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.699005 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b"} err="failed to get container status \"944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b\": rpc error: code = NotFound desc = could not find container \"944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b\": container with ID starting with 944e98a3f8077e15b72f0a1f3fbe1806e132a6a4bf2837f32046964ffc0b719b not found: ID does not exist" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.771666 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36e18875-3436-4465-85fc-f0a240394665-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.771728 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36e18875-3436-4465-85fc-f0a240394665-config-data\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.771748 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/36e18875-3436-4465-85fc-f0a240394665-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.771769 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36e18875-3436-4465-85fc-f0a240394665-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.771806 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36e18875-3436-4465-85fc-f0a240394665-logs\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.771833 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbsgt\" (UniqueName: \"kubernetes.io/projected/36e18875-3436-4465-85fc-f0a240394665-kube-api-access-gbsgt\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.771877 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/36e18875-3436-4465-85fc-f0a240394665-ceph\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.771908 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.771932 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36e18875-3436-4465-85fc-f0a240394665-scripts\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.861863 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfde5846-8a91-45d4-91f9-1c0ae1d16442" path="/var/lib/kubelet/pods/dfde5846-8a91-45d4-91f9-1c0ae1d16442/volumes" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.911137 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbsgt\" (UniqueName: \"kubernetes.io/projected/36e18875-3436-4465-85fc-f0a240394665-kube-api-access-gbsgt\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.911214 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/36e18875-3436-4465-85fc-f0a240394665-ceph\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.911253 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " 
pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.911282 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36e18875-3436-4465-85fc-f0a240394665-scripts\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.911325 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36e18875-3436-4465-85fc-f0a240394665-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.911365 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36e18875-3436-4465-85fc-f0a240394665-config-data\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.911381 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/36e18875-3436-4465-85fc-f0a240394665-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.911409 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36e18875-3436-4465-85fc-f0a240394665-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc 
kubenswrapper[4964]: I1004 03:31:18.911454 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36e18875-3436-4465-85fc-f0a240394665-logs\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.911708 4964 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.912014 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36e18875-3436-4465-85fc-f0a240394665-logs\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.912481 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/36e18875-3436-4465-85fc-f0a240394665-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.919417 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36e18875-3436-4465-85fc-f0a240394665-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.923212 4964 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36e18875-3436-4465-85fc-f0a240394665-scripts\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.923817 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36e18875-3436-4465-85fc-f0a240394665-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.928914 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36e18875-3436-4465-85fc-f0a240394665-config-data\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.948079 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/36e18875-3436-4465-85fc-f0a240394665-ceph\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.948123 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbsgt\" (UniqueName: \"kubernetes.io/projected/36e18875-3436-4465-85fc-f0a240394665-kube-api-access-gbsgt\") pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.955982 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") 
pod \"glance-default-external-api-0\" (UID: \"36e18875-3436-4465-85fc-f0a240394665\") " pod="openstack/glance-default-external-api-0" Oct 04 03:31:18 crc kubenswrapper[4964]: I1004 03:31:18.968169 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.231206 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.332406 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-internal-tls-certs\") pod \"8dacfd57-acf5-4306-9488-f1130ccc0689\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.332461 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dacfd57-acf5-4306-9488-f1130ccc0689-httpd-run\") pod \"8dacfd57-acf5-4306-9488-f1130ccc0689\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.332578 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-combined-ca-bundle\") pod \"8dacfd57-acf5-4306-9488-f1130ccc0689\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.332648 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz4h9\" (UniqueName: \"kubernetes.io/projected/8dacfd57-acf5-4306-9488-f1130ccc0689-kube-api-access-gz4h9\") pod \"8dacfd57-acf5-4306-9488-f1130ccc0689\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 
03:31:19.332699 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-config-data\") pod \"8dacfd57-acf5-4306-9488-f1130ccc0689\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.332742 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-scripts\") pod \"8dacfd57-acf5-4306-9488-f1130ccc0689\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.332774 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8dacfd57-acf5-4306-9488-f1130ccc0689-ceph\") pod \"8dacfd57-acf5-4306-9488-f1130ccc0689\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.332795 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"8dacfd57-acf5-4306-9488-f1130ccc0689\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.332857 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dacfd57-acf5-4306-9488-f1130ccc0689-logs\") pod \"8dacfd57-acf5-4306-9488-f1130ccc0689\" (UID: \"8dacfd57-acf5-4306-9488-f1130ccc0689\") " Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.333093 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dacfd57-acf5-4306-9488-f1130ccc0689-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8dacfd57-acf5-4306-9488-f1130ccc0689" (UID: "8dacfd57-acf5-4306-9488-f1130ccc0689"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.333505 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dacfd57-acf5-4306-9488-f1130ccc0689-logs" (OuterVolumeSpecName: "logs") pod "8dacfd57-acf5-4306-9488-f1130ccc0689" (UID: "8dacfd57-acf5-4306-9488-f1130ccc0689"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.334953 4964 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dacfd57-acf5-4306-9488-f1130ccc0689-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.351361 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-scripts" (OuterVolumeSpecName: "scripts") pod "8dacfd57-acf5-4306-9488-f1130ccc0689" (UID: "8dacfd57-acf5-4306-9488-f1130ccc0689"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.351389 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "8dacfd57-acf5-4306-9488-f1130ccc0689" (UID: "8dacfd57-acf5-4306-9488-f1130ccc0689"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.351508 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dacfd57-acf5-4306-9488-f1130ccc0689-kube-api-access-gz4h9" (OuterVolumeSpecName: "kube-api-access-gz4h9") pod "8dacfd57-acf5-4306-9488-f1130ccc0689" (UID: "8dacfd57-acf5-4306-9488-f1130ccc0689"). InnerVolumeSpecName "kube-api-access-gz4h9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.356064 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dacfd57-acf5-4306-9488-f1130ccc0689-ceph" (OuterVolumeSpecName: "ceph") pod "8dacfd57-acf5-4306-9488-f1130ccc0689" (UID: "8dacfd57-acf5-4306-9488-f1130ccc0689"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.375144 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dacfd57-acf5-4306-9488-f1130ccc0689" (UID: "8dacfd57-acf5-4306-9488-f1130ccc0689"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.400942 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-config-data" (OuterVolumeSpecName: "config-data") pod "8dacfd57-acf5-4306-9488-f1130ccc0689" (UID: "8dacfd57-acf5-4306-9488-f1130ccc0689"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.407717 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8dacfd57-acf5-4306-9488-f1130ccc0689" (UID: "8dacfd57-acf5-4306-9488-f1130ccc0689"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.436601 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz4h9\" (UniqueName: \"kubernetes.io/projected/8dacfd57-acf5-4306-9488-f1130ccc0689-kube-api-access-gz4h9\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.436655 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.436668 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.436679 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8dacfd57-acf5-4306-9488-f1130ccc0689-ceph\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.436717 4964 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.436730 4964 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dacfd57-acf5-4306-9488-f1130ccc0689-logs\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.436740 4964 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.436749 4964 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dacfd57-acf5-4306-9488-f1130ccc0689-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.469906 4964 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.503589 4964 generic.go:334] "Generic (PLEG): container finished" podID="8dacfd57-acf5-4306-9488-f1130ccc0689" containerID="ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b" exitCode=0 Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.504525 4964 generic.go:334] "Generic (PLEG): container finished" podID="8dacfd57-acf5-4306-9488-f1130ccc0689" containerID="ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f" exitCode=143 Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.503736 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8dacfd57-acf5-4306-9488-f1130ccc0689","Type":"ContainerDied","Data":"ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b"} Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.504661 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8dacfd57-acf5-4306-9488-f1130ccc0689","Type":"ContainerDied","Data":"ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f"} Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.504685 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8dacfd57-acf5-4306-9488-f1130ccc0689","Type":"ContainerDied","Data":"04e8055be55f4242f51dd993adfe0614c0da70602fd25557f6cfb2235a48817f"} Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.504707 4964 scope.go:117] "RemoveContainer" 
containerID="ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.503784 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.519123 4964 generic.go:334] "Generic (PLEG): container finished" podID="944df5c7-1a65-4893-bc06-306422fcb360" containerID="53fb14210708a4b01c6c5c7d019000a33646571405556fe915fad1560db977f4" exitCode=0 Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.519596 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzczl" event={"ID":"944df5c7-1a65-4893-bc06-306422fcb360","Type":"ContainerDied","Data":"53fb14210708a4b01c6c5c7d019000a33646571405556fe915fad1560db977f4"} Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.538863 4964 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.566628 4964 scope.go:117] "RemoveContainer" containerID="ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.569992 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.581979 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.591505 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.603966 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 03:31:19 crc kubenswrapper[4964]: E1004 03:31:19.604423 4964 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dacfd57-acf5-4306-9488-f1130ccc0689" containerName="glance-httpd" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.604442 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dacfd57-acf5-4306-9488-f1130ccc0689" containerName="glance-httpd" Oct 04 03:31:19 crc kubenswrapper[4964]: E1004 03:31:19.604454 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dacfd57-acf5-4306-9488-f1130ccc0689" containerName="glance-log" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.604460 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dacfd57-acf5-4306-9488-f1130ccc0689" containerName="glance-log" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.604773 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dacfd57-acf5-4306-9488-f1130ccc0689" containerName="glance-log" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.604793 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dacfd57-acf5-4306-9488-f1130ccc0689" containerName="glance-httpd" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.607043 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.617197 4964 scope.go:117] "RemoveContainer" containerID="ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b" Oct 04 03:31:19 crc kubenswrapper[4964]: E1004 03:31:19.617854 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b\": container with ID starting with ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b not found: ID does not exist" containerID="ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.617898 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b"} err="failed to get container status \"ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b\": rpc error: code = NotFound desc = could not find container \"ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b\": container with ID starting with ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b not found: ID does not exist" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.617933 4964 scope.go:117] "RemoveContainer" containerID="ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.617943 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.618814 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.619070 4964 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"glance-default-internal-config-data" Oct 04 03:31:19 crc kubenswrapper[4964]: E1004 03:31:19.622231 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f\": container with ID starting with ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f not found: ID does not exist" containerID="ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.622288 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f"} err="failed to get container status \"ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f\": rpc error: code = NotFound desc = could not find container \"ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f\": container with ID starting with ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f not found: ID does not exist" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.622317 4964 scope.go:117] "RemoveContainer" containerID="ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.624819 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b"} err="failed to get container status \"ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b\": rpc error: code = NotFound desc = could not find container \"ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b\": container with ID starting with ddf2d1409d85c50253e100e210bbff28650504df837a1ddcb4c010b8d2e0d48b not found: ID does not exist" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.624855 4964 scope.go:117] "RemoveContainer" 
containerID="ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.625468 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f"} err="failed to get container status \"ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f\": rpc error: code = NotFound desc = could not find container \"ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f\": container with ID starting with ddace60dfb42cf23790c2d1db13189b377311e6a88054e786b15d29bd0d0e96f not found: ID does not exist" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.742507 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f0ea5ed8-e2bb-461c-9541-4e04e899684c-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.742557 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0ea5ed8-e2bb-461c-9541-4e04e899684c-logs\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.742585 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxr5q\" (UniqueName: \"kubernetes.io/projected/f0ea5ed8-e2bb-461c-9541-4e04e899684c-kube-api-access-mxr5q\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.742633 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0ea5ed8-e2bb-461c-9541-4e04e899684c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.742684 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ea5ed8-e2bb-461c-9541-4e04e899684c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.742824 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ea5ed8-e2bb-461c-9541-4e04e899684c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.742862 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.742892 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0ea5ed8-e2bb-461c-9541-4e04e899684c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.743008 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ea5ed8-e2bb-461c-9541-4e04e899684c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.845070 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f0ea5ed8-e2bb-461c-9541-4e04e899684c-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.845482 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0ea5ed8-e2bb-461c-9541-4e04e899684c-logs\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.845911 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxr5q\" (UniqueName: \"kubernetes.io/projected/f0ea5ed8-e2bb-461c-9541-4e04e899684c-kube-api-access-mxr5q\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.846014 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0ea5ed8-e2bb-461c-9541-4e04e899684c-logs\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.846476 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/f0ea5ed8-e2bb-461c-9541-4e04e899684c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.846876 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0ea5ed8-e2bb-461c-9541-4e04e899684c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.847211 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ea5ed8-e2bb-461c-9541-4e04e899684c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.847422 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ea5ed8-e2bb-461c-9541-4e04e899684c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.849061 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.849124 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0ea5ed8-e2bb-461c-9541-4e04e899684c-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.849176 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ea5ed8-e2bb-461c-9541-4e04e899684c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.849395 4964 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.852197 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f0ea5ed8-e2bb-461c-9541-4e04e899684c-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.856570 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ea5ed8-e2bb-461c-9541-4e04e899684c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.859444 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ea5ed8-e2bb-461c-9541-4e04e899684c-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.860839 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ea5ed8-e2bb-461c-9541-4e04e899684c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.861558 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0ea5ed8-e2bb-461c-9541-4e04e899684c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.865804 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxr5q\" (UniqueName: \"kubernetes.io/projected/f0ea5ed8-e2bb-461c-9541-4e04e899684c-kube-api-access-mxr5q\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.909149 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"f0ea5ed8-e2bb-461c-9541-4e04e899684c\") " pod="openstack/glance-default-internal-api-0" Oct 04 03:31:19 crc kubenswrapper[4964]: I1004 03:31:19.942462 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:20 crc kubenswrapper[4964]: I1004 03:31:20.540981 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"36e18875-3436-4465-85fc-f0a240394665","Type":"ContainerStarted","Data":"8db012f95d151ea8890c4b4312ad74bc14d790057fcd478ba211316e1b23344e"} Oct 04 03:31:20 crc kubenswrapper[4964]: I1004 03:31:20.677484 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 03:31:20 crc kubenswrapper[4964]: I1004 03:31:20.858257 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:31:20 crc kubenswrapper[4964]: E1004 03:31:20.858786 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:31:20 crc kubenswrapper[4964]: I1004 03:31:20.876772 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dacfd57-acf5-4306-9488-f1130ccc0689" path="/var/lib/kubelet/pods/8dacfd57-acf5-4306-9488-f1130ccc0689/volumes" Oct 04 03:31:21 crc kubenswrapper[4964]: I1004 03:31:21.552283 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"36e18875-3436-4465-85fc-f0a240394665","Type":"ContainerStarted","Data":"75684da329f1b159fdef2014be087c911c245c217282098a2a2c0d729cf94783"} Oct 04 03:31:23 crc kubenswrapper[4964]: I1004 03:31:23.088243 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 04 03:31:23 crc 
kubenswrapper[4964]: I1004 03:31:23.200760 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 04 03:31:23 crc kubenswrapper[4964]: I1004 03:31:23.308689 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 04 03:31:23 crc kubenswrapper[4964]: I1004 03:31:23.380155 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-4c88-account-create-2d7xt"] Oct 04 03:31:23 crc kubenswrapper[4964]: I1004 03:31:23.381198 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4c88-account-create-2d7xt" Oct 04 03:31:23 crc kubenswrapper[4964]: I1004 03:31:23.382762 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 04 03:31:23 crc kubenswrapper[4964]: I1004 03:31:23.394480 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-4c88-account-create-2d7xt"] Oct 04 03:31:23 crc kubenswrapper[4964]: I1004 03:31:23.497022 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mlnc\" (UniqueName: \"kubernetes.io/projected/2c65481e-f1b3-4877-aaea-cdbb32849b0a-kube-api-access-8mlnc\") pod \"manila-4c88-account-create-2d7xt\" (UID: \"2c65481e-f1b3-4877-aaea-cdbb32849b0a\") " pod="openstack/manila-4c88-account-create-2d7xt" Oct 04 03:31:23 crc kubenswrapper[4964]: I1004 03:31:23.599008 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mlnc\" (UniqueName: \"kubernetes.io/projected/2c65481e-f1b3-4877-aaea-cdbb32849b0a-kube-api-access-8mlnc\") pod \"manila-4c88-account-create-2d7xt\" (UID: \"2c65481e-f1b3-4877-aaea-cdbb32849b0a\") " pod="openstack/manila-4c88-account-create-2d7xt" Oct 04 03:31:23 crc kubenswrapper[4964]: I1004 03:31:23.626117 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8mlnc\" (UniqueName: \"kubernetes.io/projected/2c65481e-f1b3-4877-aaea-cdbb32849b0a-kube-api-access-8mlnc\") pod \"manila-4c88-account-create-2d7xt\" (UID: \"2c65481e-f1b3-4877-aaea-cdbb32849b0a\") " pod="openstack/manila-4c88-account-create-2d7xt" Oct 04 03:31:23 crc kubenswrapper[4964]: I1004 03:31:23.712178 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4c88-account-create-2d7xt" Oct 04 03:31:25 crc kubenswrapper[4964]: I1004 03:31:25.607290 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f0ea5ed8-e2bb-461c-9541-4e04e899684c","Type":"ContainerStarted","Data":"f72f50beb6df288cd404e74534502b66a8c4181d33c4d06c0b6a8cf4e20a31ef"} Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.046068 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-4c88-account-create-2d7xt"] Oct 04 03:31:26 crc kubenswrapper[4964]: W1004 03:31:26.069508 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c65481e_f1b3_4877_aaea_cdbb32849b0a.slice/crio-90222000faa8547887374e23639a29f046e8af410db4fbed7864f5ad82ee40d7 WatchSource:0}: Error finding container 90222000faa8547887374e23639a29f046e8af410db4fbed7864f5ad82ee40d7: Status 404 returned error can't find the container with id 90222000faa8547887374e23639a29f046e8af410db4fbed7864f5ad82ee40d7 Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.625045 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64f9d99668-zvfzz" event={"ID":"39f65132-4f7b-4c79-ba9b-e86c15ec60d6","Type":"ContainerStarted","Data":"700bbb424a3a19fdd292e74d0b091783be295edfed75ff4ba7d6a90406e9aae7"} Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.625317 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64f9d99668-zvfzz" 
event={"ID":"39f65132-4f7b-4c79-ba9b-e86c15ec60d6","Type":"ContainerStarted","Data":"1b80b00f02255a1b8f409b96d571bd146d62b189fe33fa93d372e38fd6b19269"} Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.634883 4964 generic.go:334] "Generic (PLEG): container finished" podID="2c65481e-f1b3-4877-aaea-cdbb32849b0a" containerID="fe0cad209f2d6eeebc1aaa5a4feffecdafea16bb39eee03ce7b50ab84f12e85d" exitCode=0 Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.634951 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4c88-account-create-2d7xt" event={"ID":"2c65481e-f1b3-4877-aaea-cdbb32849b0a","Type":"ContainerDied","Data":"fe0cad209f2d6eeebc1aaa5a4feffecdafea16bb39eee03ce7b50ab84f12e85d"} Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.634977 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4c88-account-create-2d7xt" event={"ID":"2c65481e-f1b3-4877-aaea-cdbb32849b0a","Type":"ContainerStarted","Data":"90222000faa8547887374e23639a29f046e8af410db4fbed7864f5ad82ee40d7"} Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.637491 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67d99bc788-zm78q" event={"ID":"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc","Type":"ContainerStarted","Data":"ac2c21b96f0b054e79a42217e3dd37ac63c998e39af0c30302c7e6484c3558a7"} Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.637518 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67d99bc788-zm78q" event={"ID":"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc","Type":"ContainerStarted","Data":"75ce3f7ac04385521960386f6a1a7a6c041161ff627b8d8c088953808e5f3262"} Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.648147 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"36e18875-3436-4465-85fc-f0a240394665","Type":"ContainerStarted","Data":"5b6057600e04b5700326b295bc69b4cf2387589c2682687ec24e4157cc4655e5"} Oct 04 03:31:26 
crc kubenswrapper[4964]: I1004 03:31:26.652164 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-64f9d99668-zvfzz" podStartSLOduration=3.02291271 podStartE2EDuration="11.65214672s" podCreationTimestamp="2025-10-04 03:31:15 +0000 UTC" firstStartedPulling="2025-10-04 03:31:17.049822916 +0000 UTC m=+3056.946781554" lastFinishedPulling="2025-10-04 03:31:25.679056926 +0000 UTC m=+3065.576015564" observedRunningTime="2025-10-04 03:31:26.645058942 +0000 UTC m=+3066.542017580" watchObservedRunningTime="2025-10-04 03:31:26.65214672 +0000 UTC m=+3066.549105358" Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.681194 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67d99bc788-zm78q" podStartSLOduration=2.83034698 podStartE2EDuration="11.681172743s" podCreationTimestamp="2025-10-04 03:31:15 +0000 UTC" firstStartedPulling="2025-10-04 03:31:16.838970779 +0000 UTC m=+3056.735929417" lastFinishedPulling="2025-10-04 03:31:25.689796542 +0000 UTC m=+3065.586755180" observedRunningTime="2025-10-04 03:31:26.674759723 +0000 UTC m=+3066.571718371" watchObservedRunningTime="2025-10-04 03:31:26.681172743 +0000 UTC m=+3066.578131381" Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.685461 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzczl" event={"ID":"944df5c7-1a65-4893-bc06-306422fcb360","Type":"ContainerStarted","Data":"380e8b94b5367f94de95330413fd40b46ff2fe24db63adbfd1346fdf629343f2"} Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.691680 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6554d5fc67-d5vgm" event={"ID":"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d","Type":"ContainerStarted","Data":"f9dba9d2f9e7012d4bbce2855cf56cc639bc92f129ebbfe7db2e8843f3dfc6f1"} Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.691728 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-6554d5fc67-d5vgm" event={"ID":"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d","Type":"ContainerStarted","Data":"d40fffc9c4239f14b7da2efa8b90575b3f120bb991d1de67ba17d642fc751516"} Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.691848 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6554d5fc67-d5vgm" podUID="5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" containerName="horizon-log" containerID="cri-o://d40fffc9c4239f14b7da2efa8b90575b3f120bb991d1de67ba17d642fc751516" gracePeriod=30 Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.692095 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6554d5fc67-d5vgm" podUID="5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" containerName="horizon" containerID="cri-o://f9dba9d2f9e7012d4bbce2855cf56cc639bc92f129ebbfe7db2e8843f3dfc6f1" gracePeriod=30 Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.695326 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f0ea5ed8-e2bb-461c-9541-4e04e899684c","Type":"ContainerStarted","Data":"96a3297885ba16eb706594951d2f343ef9acf457b8e40d1ef38839e5664b31b0"} Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.706678 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.70665739 podStartE2EDuration="8.70665739s" podCreationTimestamp="2025-10-04 03:31:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 03:31:26.695850223 +0000 UTC m=+3066.592808861" watchObservedRunningTime="2025-10-04 03:31:26.70665739 +0000 UTC m=+3066.603616028" Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.706974 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66574674fc-j28wh" 
event={"ID":"5445c577-835c-432b-88da-5fe7a9107cac","Type":"ContainerStarted","Data":"8783667709de39284bf8a0978d3e5305ddb1bbe37e2e1b4d4326bad59c79ca5b"} Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.707018 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66574674fc-j28wh" event={"ID":"5445c577-835c-432b-88da-5fe7a9107cac","Type":"ContainerStarted","Data":"3d537acce4c0d406a39324f85e4cce0afb66f09ae91bbb7394f6c50a50f4c3bc"} Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.707139 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66574674fc-j28wh" podUID="5445c577-835c-432b-88da-5fe7a9107cac" containerName="horizon-log" containerID="cri-o://3d537acce4c0d406a39324f85e4cce0afb66f09ae91bbb7394f6c50a50f4c3bc" gracePeriod=30 Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.707159 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66574674fc-j28wh" podUID="5445c577-835c-432b-88da-5fe7a9107cac" containerName="horizon" containerID="cri-o://8783667709de39284bf8a0978d3e5305ddb1bbe37e2e1b4d4326bad59c79ca5b" gracePeriod=30 Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.717149 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6554d5fc67-d5vgm" podStartSLOduration=2.636103695 podStartE2EDuration="13.717131589s" podCreationTimestamp="2025-10-04 03:31:13 +0000 UTC" firstStartedPulling="2025-10-04 03:31:14.580413094 +0000 UTC m=+3054.477371732" lastFinishedPulling="2025-10-04 03:31:25.661440998 +0000 UTC m=+3065.558399626" observedRunningTime="2025-10-04 03:31:26.713499932 +0000 UTC m=+3066.610458570" watchObservedRunningTime="2025-10-04 03:31:26.717131589 +0000 UTC m=+3066.614090227" Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.732478 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lzczl" 
podStartSLOduration=3.702628123 podStartE2EDuration="11.732455346s" podCreationTimestamp="2025-10-04 03:31:15 +0000 UTC" firstStartedPulling="2025-10-04 03:31:17.468295363 +0000 UTC m=+3057.365254001" lastFinishedPulling="2025-10-04 03:31:25.498122586 +0000 UTC m=+3065.395081224" observedRunningTime="2025-10-04 03:31:26.729326253 +0000 UTC m=+3066.626284891" watchObservedRunningTime="2025-10-04 03:31:26.732455346 +0000 UTC m=+3066.629413984" Oct 04 03:31:26 crc kubenswrapper[4964]: I1004 03:31:26.751965 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-66574674fc-j28wh" podStartSLOduration=2.438747876 podStartE2EDuration="13.751950214s" podCreationTimestamp="2025-10-04 03:31:13 +0000 UTC" firstStartedPulling="2025-10-04 03:31:14.364537854 +0000 UTC m=+3054.261496492" lastFinishedPulling="2025-10-04 03:31:25.677740192 +0000 UTC m=+3065.574698830" observedRunningTime="2025-10-04 03:31:26.750488366 +0000 UTC m=+3066.647447034" watchObservedRunningTime="2025-10-04 03:31:26.751950214 +0000 UTC m=+3066.648908852" Oct 04 03:31:27 crc kubenswrapper[4964]: I1004 03:31:27.720941 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f0ea5ed8-e2bb-461c-9541-4e04e899684c","Type":"ContainerStarted","Data":"2d7fb348fa45d35550736791e868ba501cf757e40714aef8984cfea8cd410541"} Oct 04 03:31:27 crc kubenswrapper[4964]: I1004 03:31:27.750162 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.750106904999999 podStartE2EDuration="8.750106905s" podCreationTimestamp="2025-10-04 03:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 03:31:27.747537177 +0000 UTC m=+3067.644495855" watchObservedRunningTime="2025-10-04 03:31:27.750106905 +0000 UTC m=+3067.647065543" Oct 04 03:31:28 crc 
kubenswrapper[4964]: I1004 03:31:28.158742 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4c88-account-create-2d7xt" Oct 04 03:31:28 crc kubenswrapper[4964]: I1004 03:31:28.210758 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mlnc\" (UniqueName: \"kubernetes.io/projected/2c65481e-f1b3-4877-aaea-cdbb32849b0a-kube-api-access-8mlnc\") pod \"2c65481e-f1b3-4877-aaea-cdbb32849b0a\" (UID: \"2c65481e-f1b3-4877-aaea-cdbb32849b0a\") " Oct 04 03:31:28 crc kubenswrapper[4964]: I1004 03:31:28.216866 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c65481e-f1b3-4877-aaea-cdbb32849b0a-kube-api-access-8mlnc" (OuterVolumeSpecName: "kube-api-access-8mlnc") pod "2c65481e-f1b3-4877-aaea-cdbb32849b0a" (UID: "2c65481e-f1b3-4877-aaea-cdbb32849b0a"). InnerVolumeSpecName "kube-api-access-8mlnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:31:28 crc kubenswrapper[4964]: I1004 03:31:28.312778 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mlnc\" (UniqueName: \"kubernetes.io/projected/2c65481e-f1b3-4877-aaea-cdbb32849b0a-kube-api-access-8mlnc\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:28 crc kubenswrapper[4964]: I1004 03:31:28.736734 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-4c88-account-create-2d7xt" Oct 04 03:31:28 crc kubenswrapper[4964]: I1004 03:31:28.736754 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4c88-account-create-2d7xt" event={"ID":"2c65481e-f1b3-4877-aaea-cdbb32849b0a","Type":"ContainerDied","Data":"90222000faa8547887374e23639a29f046e8af410db4fbed7864f5ad82ee40d7"} Oct 04 03:31:28 crc kubenswrapper[4964]: I1004 03:31:28.737101 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90222000faa8547887374e23639a29f046e8af410db4fbed7864f5ad82ee40d7" Oct 04 03:31:28 crc kubenswrapper[4964]: I1004 03:31:28.969642 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 04 03:31:28 crc kubenswrapper[4964]: I1004 03:31:28.969726 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 04 03:31:29 crc kubenswrapper[4964]: I1004 03:31:29.038086 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 04 03:31:29 crc kubenswrapper[4964]: I1004 03:31:29.065561 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 04 03:31:29 crc kubenswrapper[4964]: I1004 03:31:29.752045 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 04 03:31:29 crc kubenswrapper[4964]: I1004 03:31:29.752088 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 04 03:31:29 crc kubenswrapper[4964]: I1004 03:31:29.943219 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:29 crc kubenswrapper[4964]: I1004 03:31:29.943513 4964 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:29 crc kubenswrapper[4964]: I1004 03:31:29.980042 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:29 crc kubenswrapper[4964]: I1004 03:31:29.993975 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:30 crc kubenswrapper[4964]: I1004 03:31:30.762073 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:30 crc kubenswrapper[4964]: I1004 03:31:30.762134 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.349210 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.688450 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.688496 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.752070 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-fqh7q"] Oct 04 03:31:33 crc kubenswrapper[4964]: E1004 03:31:33.752528 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c65481e-f1b3-4877-aaea-cdbb32849b0a" containerName="mariadb-account-create" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.752546 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c65481e-f1b3-4877-aaea-cdbb32849b0a" containerName="mariadb-account-create" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.752839 4964 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2c65481e-f1b3-4877-aaea-cdbb32849b0a" containerName="mariadb-account-create" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.753584 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-fqh7q" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.756875 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-hv6p8" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.762061 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.770134 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-fqh7q"] Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.840974 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkgqg\" (UniqueName: \"kubernetes.io/projected/47754785-d3d3-461f-a710-cb74b48b1a3e-kube-api-access-nkgqg\") pod \"manila-db-sync-fqh7q\" (UID: \"47754785-d3d3-461f-a710-cb74b48b1a3e\") " pod="openstack/manila-db-sync-fqh7q" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.841030 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-config-data\") pod \"manila-db-sync-fqh7q\" (UID: \"47754785-d3d3-461f-a710-cb74b48b1a3e\") " pod="openstack/manila-db-sync-fqh7q" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.841079 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-combined-ca-bundle\") pod \"manila-db-sync-fqh7q\" (UID: \"47754785-d3d3-461f-a710-cb74b48b1a3e\") " pod="openstack/manila-db-sync-fqh7q" 
Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.841331 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-job-config-data\") pod \"manila-db-sync-fqh7q\" (UID: \"47754785-d3d3-461f-a710-cb74b48b1a3e\") " pod="openstack/manila-db-sync-fqh7q" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.917480 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.943114 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-job-config-data\") pod \"manila-db-sync-fqh7q\" (UID: \"47754785-d3d3-461f-a710-cb74b48b1a3e\") " pod="openstack/manila-db-sync-fqh7q" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.943331 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkgqg\" (UniqueName: \"kubernetes.io/projected/47754785-d3d3-461f-a710-cb74b48b1a3e-kube-api-access-nkgqg\") pod \"manila-db-sync-fqh7q\" (UID: \"47754785-d3d3-461f-a710-cb74b48b1a3e\") " pod="openstack/manila-db-sync-fqh7q" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.943391 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-config-data\") pod \"manila-db-sync-fqh7q\" (UID: \"47754785-d3d3-461f-a710-cb74b48b1a3e\") " pod="openstack/manila-db-sync-fqh7q" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.943490 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-combined-ca-bundle\") pod \"manila-db-sync-fqh7q\" (UID: 
\"47754785-d3d3-461f-a710-cb74b48b1a3e\") " pod="openstack/manila-db-sync-fqh7q" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.967207 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-config-data\") pod \"manila-db-sync-fqh7q\" (UID: \"47754785-d3d3-461f-a710-cb74b48b1a3e\") " pod="openstack/manila-db-sync-fqh7q" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.969184 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-combined-ca-bundle\") pod \"manila-db-sync-fqh7q\" (UID: \"47754785-d3d3-461f-a710-cb74b48b1a3e\") " pod="openstack/manila-db-sync-fqh7q" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.972327 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-job-config-data\") pod \"manila-db-sync-fqh7q\" (UID: \"47754785-d3d3-461f-a710-cb74b48b1a3e\") " pod="openstack/manila-db-sync-fqh7q" Oct 04 03:31:33 crc kubenswrapper[4964]: I1004 03:31:33.973684 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkgqg\" (UniqueName: \"kubernetes.io/projected/47754785-d3d3-461f-a710-cb74b48b1a3e-kube-api-access-nkgqg\") pod \"manila-db-sync-fqh7q\" (UID: \"47754785-d3d3-461f-a710-cb74b48b1a3e\") " pod="openstack/manila-db-sync-fqh7q" Oct 04 03:31:34 crc kubenswrapper[4964]: I1004 03:31:34.024007 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:34 crc kubenswrapper[4964]: I1004 03:31:34.089209 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-fqh7q" Oct 04 03:31:34 crc kubenswrapper[4964]: I1004 03:31:34.712665 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-fqh7q"] Oct 04 03:31:34 crc kubenswrapper[4964]: I1004 03:31:34.731649 4964 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 03:31:34 crc kubenswrapper[4964]: I1004 03:31:34.821003 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fqh7q" event={"ID":"47754785-d3d3-461f-a710-cb74b48b1a3e","Type":"ContainerStarted","Data":"e4b9d8b178db11a4bde614eb37456e036c73531e29120e1362a45d647f8e50d6"} Oct 04 03:31:35 crc kubenswrapper[4964]: I1004 03:31:35.391221 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 04 03:31:35 crc kubenswrapper[4964]: I1004 03:31:35.822097 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:35 crc kubenswrapper[4964]: I1004 03:31:35.822153 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:35 crc kubenswrapper[4964]: I1004 03:31:35.845750 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:31:35 crc kubenswrapper[4964]: E1004 03:31:35.846003 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:31:35 crc kubenswrapper[4964]: I1004 03:31:35.872275 4964 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:36 crc kubenswrapper[4964]: I1004 03:31:36.162254 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:36 crc kubenswrapper[4964]: I1004 03:31:36.163306 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:36 crc kubenswrapper[4964]: I1004 03:31:36.165688 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67d99bc788-zm78q" podUID="66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.245:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.245:8443: connect: connection refused" Oct 04 03:31:36 crc kubenswrapper[4964]: I1004 03:31:36.392804 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:36 crc kubenswrapper[4964]: I1004 03:31:36.393525 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:36 crc kubenswrapper[4964]: I1004 03:31:36.394079 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-64f9d99668-zvfzz" podUID="39f65132-4f7b-4c79-ba9b-e86c15ec60d6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.246:8443: connect: connection refused" Oct 04 03:31:36 crc kubenswrapper[4964]: I1004 03:31:36.879955 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:36 crc kubenswrapper[4964]: I1004 03:31:36.933308 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lzczl"] Oct 04 03:31:38 crc 
kubenswrapper[4964]: I1004 03:31:38.867111 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lzczl" podUID="944df5c7-1a65-4893-bc06-306422fcb360" containerName="registry-server" containerID="cri-o://380e8b94b5367f94de95330413fd40b46ff2fe24db63adbfd1346fdf629343f2" gracePeriod=2 Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.533385 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.688867 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944df5c7-1a65-4893-bc06-306422fcb360-utilities\") pod \"944df5c7-1a65-4893-bc06-306422fcb360\" (UID: \"944df5c7-1a65-4893-bc06-306422fcb360\") " Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.688988 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944df5c7-1a65-4893-bc06-306422fcb360-catalog-content\") pod \"944df5c7-1a65-4893-bc06-306422fcb360\" (UID: \"944df5c7-1a65-4893-bc06-306422fcb360\") " Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.689142 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfhfc\" (UniqueName: \"kubernetes.io/projected/944df5c7-1a65-4893-bc06-306422fcb360-kube-api-access-tfhfc\") pod \"944df5c7-1a65-4893-bc06-306422fcb360\" (UID: \"944df5c7-1a65-4893-bc06-306422fcb360\") " Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.689745 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/944df5c7-1a65-4893-bc06-306422fcb360-utilities" (OuterVolumeSpecName: "utilities") pod "944df5c7-1a65-4893-bc06-306422fcb360" (UID: "944df5c7-1a65-4893-bc06-306422fcb360"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.694482 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/944df5c7-1a65-4893-bc06-306422fcb360-kube-api-access-tfhfc" (OuterVolumeSpecName: "kube-api-access-tfhfc") pod "944df5c7-1a65-4893-bc06-306422fcb360" (UID: "944df5c7-1a65-4893-bc06-306422fcb360"). InnerVolumeSpecName "kube-api-access-tfhfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.730662 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/944df5c7-1a65-4893-bc06-306422fcb360-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "944df5c7-1a65-4893-bc06-306422fcb360" (UID: "944df5c7-1a65-4893-bc06-306422fcb360"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.791848 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944df5c7-1a65-4893-bc06-306422fcb360-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.792097 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfhfc\" (UniqueName: \"kubernetes.io/projected/944df5c7-1a65-4893-bc06-306422fcb360-kube-api-access-tfhfc\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.792113 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944df5c7-1a65-4893-bc06-306422fcb360-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.879747 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fqh7q" 
event={"ID":"47754785-d3d3-461f-a710-cb74b48b1a3e","Type":"ContainerStarted","Data":"04e69927611ac904588a58a6ffd367f6ef1f5b2c045116cd182b6a84c21945d9"} Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.884337 4964 generic.go:334] "Generic (PLEG): container finished" podID="944df5c7-1a65-4893-bc06-306422fcb360" containerID="380e8b94b5367f94de95330413fd40b46ff2fe24db63adbfd1346fdf629343f2" exitCode=0 Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.884372 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzczl" event={"ID":"944df5c7-1a65-4893-bc06-306422fcb360","Type":"ContainerDied","Data":"380e8b94b5367f94de95330413fd40b46ff2fe24db63adbfd1346fdf629343f2"} Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.884394 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzczl" event={"ID":"944df5c7-1a65-4893-bc06-306422fcb360","Type":"ContainerDied","Data":"a5092033c369419be2fef1c364bad892be95c682e327340e530edb4d078afeec"} Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.884416 4964 scope.go:117] "RemoveContainer" containerID="380e8b94b5367f94de95330413fd40b46ff2fe24db63adbfd1346fdf629343f2" Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.884539 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lzczl" Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.900356 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-fqh7q" podStartSLOduration=2.370755789 podStartE2EDuration="6.90033772s" podCreationTimestamp="2025-10-04 03:31:33 +0000 UTC" firstStartedPulling="2025-10-04 03:31:34.731423168 +0000 UTC m=+3074.628381806" lastFinishedPulling="2025-10-04 03:31:39.261005069 +0000 UTC m=+3079.157963737" observedRunningTime="2025-10-04 03:31:39.895803779 +0000 UTC m=+3079.792762427" watchObservedRunningTime="2025-10-04 03:31:39.90033772 +0000 UTC m=+3079.797296368" Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.950236 4964 scope.go:117] "RemoveContainer" containerID="53fb14210708a4b01c6c5c7d019000a33646571405556fe915fad1560db977f4" Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.951440 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lzczl"] Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.961973 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lzczl"] Oct 04 03:31:39 crc kubenswrapper[4964]: I1004 03:31:39.982396 4964 scope.go:117] "RemoveContainer" containerID="61fb44aa24560f419384d8cc3d72d6dd6ab1d86780e207e42935c04c7fe8df2b" Oct 04 03:31:40 crc kubenswrapper[4964]: I1004 03:31:40.016311 4964 scope.go:117] "RemoveContainer" containerID="380e8b94b5367f94de95330413fd40b46ff2fe24db63adbfd1346fdf629343f2" Oct 04 03:31:40 crc kubenswrapper[4964]: E1004 03:31:40.016876 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"380e8b94b5367f94de95330413fd40b46ff2fe24db63adbfd1346fdf629343f2\": container with ID starting with 380e8b94b5367f94de95330413fd40b46ff2fe24db63adbfd1346fdf629343f2 not found: ID does not exist" 
containerID="380e8b94b5367f94de95330413fd40b46ff2fe24db63adbfd1346fdf629343f2" Oct 04 03:31:40 crc kubenswrapper[4964]: I1004 03:31:40.016910 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"380e8b94b5367f94de95330413fd40b46ff2fe24db63adbfd1346fdf629343f2"} err="failed to get container status \"380e8b94b5367f94de95330413fd40b46ff2fe24db63adbfd1346fdf629343f2\": rpc error: code = NotFound desc = could not find container \"380e8b94b5367f94de95330413fd40b46ff2fe24db63adbfd1346fdf629343f2\": container with ID starting with 380e8b94b5367f94de95330413fd40b46ff2fe24db63adbfd1346fdf629343f2 not found: ID does not exist" Oct 04 03:31:40 crc kubenswrapper[4964]: I1004 03:31:40.016935 4964 scope.go:117] "RemoveContainer" containerID="53fb14210708a4b01c6c5c7d019000a33646571405556fe915fad1560db977f4" Oct 04 03:31:40 crc kubenswrapper[4964]: E1004 03:31:40.017377 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53fb14210708a4b01c6c5c7d019000a33646571405556fe915fad1560db977f4\": container with ID starting with 53fb14210708a4b01c6c5c7d019000a33646571405556fe915fad1560db977f4 not found: ID does not exist" containerID="53fb14210708a4b01c6c5c7d019000a33646571405556fe915fad1560db977f4" Oct 04 03:31:40 crc kubenswrapper[4964]: I1004 03:31:40.017422 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53fb14210708a4b01c6c5c7d019000a33646571405556fe915fad1560db977f4"} err="failed to get container status \"53fb14210708a4b01c6c5c7d019000a33646571405556fe915fad1560db977f4\": rpc error: code = NotFound desc = could not find container \"53fb14210708a4b01c6c5c7d019000a33646571405556fe915fad1560db977f4\": container with ID starting with 53fb14210708a4b01c6c5c7d019000a33646571405556fe915fad1560db977f4 not found: ID does not exist" Oct 04 03:31:40 crc kubenswrapper[4964]: I1004 03:31:40.017456 4964 scope.go:117] 
"RemoveContainer" containerID="61fb44aa24560f419384d8cc3d72d6dd6ab1d86780e207e42935c04c7fe8df2b" Oct 04 03:31:40 crc kubenswrapper[4964]: E1004 03:31:40.017839 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61fb44aa24560f419384d8cc3d72d6dd6ab1d86780e207e42935c04c7fe8df2b\": container with ID starting with 61fb44aa24560f419384d8cc3d72d6dd6ab1d86780e207e42935c04c7fe8df2b not found: ID does not exist" containerID="61fb44aa24560f419384d8cc3d72d6dd6ab1d86780e207e42935c04c7fe8df2b" Oct 04 03:31:40 crc kubenswrapper[4964]: I1004 03:31:40.017870 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61fb44aa24560f419384d8cc3d72d6dd6ab1d86780e207e42935c04c7fe8df2b"} err="failed to get container status \"61fb44aa24560f419384d8cc3d72d6dd6ab1d86780e207e42935c04c7fe8df2b\": rpc error: code = NotFound desc = could not find container \"61fb44aa24560f419384d8cc3d72d6dd6ab1d86780e207e42935c04c7fe8df2b\": container with ID starting with 61fb44aa24560f419384d8cc3d72d6dd6ab1d86780e207e42935c04c7fe8df2b not found: ID does not exist" Oct 04 03:31:40 crc kubenswrapper[4964]: I1004 03:31:40.860740 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="944df5c7-1a65-4893-bc06-306422fcb360" path="/var/lib/kubelet/pods/944df5c7-1a65-4893-bc06-306422fcb360/volumes" Oct 04 03:31:46 crc kubenswrapper[4964]: I1004 03:31:46.846153 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:31:46 crc kubenswrapper[4964]: E1004 03:31:46.847273 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:31:47 crc kubenswrapper[4964]: I1004 03:31:47.947061 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:48 crc kubenswrapper[4964]: I1004 03:31:48.207214 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:48 crc kubenswrapper[4964]: I1004 03:31:48.980240 4964 generic.go:334] "Generic (PLEG): container finished" podID="47754785-d3d3-461f-a710-cb74b48b1a3e" containerID="04e69927611ac904588a58a6ffd367f6ef1f5b2c045116cd182b6a84c21945d9" exitCode=0 Oct 04 03:31:48 crc kubenswrapper[4964]: I1004 03:31:48.980307 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fqh7q" event={"ID":"47754785-d3d3-461f-a710-cb74b48b1a3e","Type":"ContainerDied","Data":"04e69927611ac904588a58a6ffd367f6ef1f5b2c045116cd182b6a84c21945d9"} Oct 04 03:31:49 crc kubenswrapper[4964]: I1004 03:31:49.631762 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:31:49 crc kubenswrapper[4964]: I1004 03:31:49.885706 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-64f9d99668-zvfzz" Oct 04 03:31:49 crc kubenswrapper[4964]: I1004 03:31:49.954260 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67d99bc788-zm78q"] Oct 04 03:31:49 crc kubenswrapper[4964]: I1004 03:31:49.988318 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67d99bc788-zm78q" podUID="66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" containerName="horizon-log" containerID="cri-o://75ce3f7ac04385521960386f6a1a7a6c041161ff627b8d8c088953808e5f3262" gracePeriod=30 Oct 04 03:31:49 crc kubenswrapper[4964]: I1004 03:31:49.988367 4964 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/horizon-67d99bc788-zm78q" podUID="66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" containerName="horizon" containerID="cri-o://ac2c21b96f0b054e79a42217e3dd37ac63c998e39af0c30302c7e6484c3558a7" gracePeriod=30 Oct 04 03:31:50 crc kubenswrapper[4964]: I1004 03:31:50.418435 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-fqh7q" Oct 04 03:31:50 crc kubenswrapper[4964]: I1004 03:31:50.531883 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-combined-ca-bundle\") pod \"47754785-d3d3-461f-a710-cb74b48b1a3e\" (UID: \"47754785-d3d3-461f-a710-cb74b48b1a3e\") " Oct 04 03:31:50 crc kubenswrapper[4964]: I1004 03:31:50.532032 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-config-data\") pod \"47754785-d3d3-461f-a710-cb74b48b1a3e\" (UID: \"47754785-d3d3-461f-a710-cb74b48b1a3e\") " Oct 04 03:31:50 crc kubenswrapper[4964]: I1004 03:31:50.532075 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkgqg\" (UniqueName: \"kubernetes.io/projected/47754785-d3d3-461f-a710-cb74b48b1a3e-kube-api-access-nkgqg\") pod \"47754785-d3d3-461f-a710-cb74b48b1a3e\" (UID: \"47754785-d3d3-461f-a710-cb74b48b1a3e\") " Oct 04 03:31:50 crc kubenswrapper[4964]: I1004 03:31:50.532148 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-job-config-data\") pod \"47754785-d3d3-461f-a710-cb74b48b1a3e\" (UID: \"47754785-d3d3-461f-a710-cb74b48b1a3e\") " Oct 04 03:31:50 crc kubenswrapper[4964]: I1004 03:31:50.537809 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "47754785-d3d3-461f-a710-cb74b48b1a3e" (UID: "47754785-d3d3-461f-a710-cb74b48b1a3e"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:31:50 crc kubenswrapper[4964]: I1004 03:31:50.537822 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47754785-d3d3-461f-a710-cb74b48b1a3e-kube-api-access-nkgqg" (OuterVolumeSpecName: "kube-api-access-nkgqg") pod "47754785-d3d3-461f-a710-cb74b48b1a3e" (UID: "47754785-d3d3-461f-a710-cb74b48b1a3e"). InnerVolumeSpecName "kube-api-access-nkgqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:31:50 crc kubenswrapper[4964]: I1004 03:31:50.541412 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-config-data" (OuterVolumeSpecName: "config-data") pod "47754785-d3d3-461f-a710-cb74b48b1a3e" (UID: "47754785-d3d3-461f-a710-cb74b48b1a3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:31:50 crc kubenswrapper[4964]: I1004 03:31:50.560968 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47754785-d3d3-461f-a710-cb74b48b1a3e" (UID: "47754785-d3d3-461f-a710-cb74b48b1a3e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:31:50 crc kubenswrapper[4964]: I1004 03:31:50.634896 4964 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:50 crc kubenswrapper[4964]: I1004 03:31:50.634944 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:50 crc kubenswrapper[4964]: I1004 03:31:50.634964 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47754785-d3d3-461f-a710-cb74b48b1a3e-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:50 crc kubenswrapper[4964]: I1004 03:31:50.634982 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkgqg\" (UniqueName: \"kubernetes.io/projected/47754785-d3d3-461f-a710-cb74b48b1a3e-kube-api-access-nkgqg\") on node \"crc\" DevicePath \"\"" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.002009 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fqh7q" event={"ID":"47754785-d3d3-461f-a710-cb74b48b1a3e","Type":"ContainerDied","Data":"e4b9d8b178db11a4bde614eb37456e036c73531e29120e1362a45d647f8e50d6"} Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.002052 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4b9d8b178db11a4bde614eb37456e036c73531e29120e1362a45d647f8e50d6" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.002813 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-fqh7q" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.437789 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 04 03:31:51 crc kubenswrapper[4964]: E1004 03:31:51.438562 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944df5c7-1a65-4893-bc06-306422fcb360" containerName="extract-utilities" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.438592 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="944df5c7-1a65-4893-bc06-306422fcb360" containerName="extract-utilities" Oct 04 03:31:51 crc kubenswrapper[4964]: E1004 03:31:51.438651 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47754785-d3d3-461f-a710-cb74b48b1a3e" containerName="manila-db-sync" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.438661 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="47754785-d3d3-461f-a710-cb74b48b1a3e" containerName="manila-db-sync" Oct 04 03:31:51 crc kubenswrapper[4964]: E1004 03:31:51.438676 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944df5c7-1a65-4893-bc06-306422fcb360" containerName="registry-server" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.438686 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="944df5c7-1a65-4893-bc06-306422fcb360" containerName="registry-server" Oct 04 03:31:51 crc kubenswrapper[4964]: E1004 03:31:51.438702 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944df5c7-1a65-4893-bc06-306422fcb360" containerName="extract-content" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.438710 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="944df5c7-1a65-4893-bc06-306422fcb360" containerName="extract-content" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.438946 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="47754785-d3d3-461f-a710-cb74b48b1a3e" 
containerName="manila-db-sync" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.438977 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="944df5c7-1a65-4893-bc06-306422fcb360" containerName="registry-server" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.440205 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.443966 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.444593 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-hv6p8" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.444895 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.445046 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.458035 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.459837 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.469605 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.477697 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-wn8qh"] Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.479264 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.493137 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.524445 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.541493 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-wn8qh"] Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.556185 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.556374 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.556441 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.556507 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-config-data\") 
pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.556647 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-scripts\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.556733 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.556849 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-config-data\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.556909 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.556981 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-scripts\") pod \"manila-scheduler-0\" (UID: 
\"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.557051 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9wl9\" (UniqueName: \"kubernetes.io/projected/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-kube-api-access-f9wl9\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.557127 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd656502-4058-4689-9474-b5b16bed8695-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.557185 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7497\" (UniqueName: \"kubernetes.io/projected/bd656502-4058-4689-9474-b5b16bed8695-kube-api-access-j7497\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.557254 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-ceph\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.557319 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-var-lib-manila\") pod \"manila-share-share1-0\" (UID: 
\"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.651348 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.653076 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.655722 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658386 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658552 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658588 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mllrk\" (UniqueName: \"kubernetes.io/projected/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-kube-api-access-mllrk\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658612 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658650 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658672 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658693 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-config-data\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658735 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658771 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-scripts\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658794 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-config\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658834 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658896 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-config-data\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658922 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658946 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-scripts\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658962 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9wl9\" (UniqueName: \"kubernetes.io/projected/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-kube-api-access-f9wl9\") pod \"manila-share-share1-0\" 
(UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.658989 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd656502-4058-4689-9474-b5b16bed8695-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.659004 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7497\" (UniqueName: \"kubernetes.io/projected/bd656502-4058-4689-9474-b5b16bed8695-kube-api-access-j7497\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.659022 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-ceph\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.659039 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.659074 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc 
kubenswrapper[4964]: I1004 03:31:51.659099 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.660740 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.663176 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.663363 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-config-data\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.663428 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.664547 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-scripts\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.664658 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.665075 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.665251 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd656502-4058-4689-9474-b5b16bed8695-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.669750 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-ceph\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.675024 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-config-data\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 
04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.675348 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.677379 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-scripts\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.687813 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9wl9\" (UniqueName: \"kubernetes.io/projected/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-kube-api-access-f9wl9\") pod \"manila-share-share1-0\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.690838 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7497\" (UniqueName: \"kubernetes.io/projected/bd656502-4058-4689-9474-b5b16bed8695-kube-api-access-j7497\") pod \"manila-scheduler-0\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.761781 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.762704 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.762653 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.762790 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.763535 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.763608 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-etc-machine-id\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.763740 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.763766 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mllrk\" (UniqueName: \"kubernetes.io/projected/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-kube-api-access-mllrk\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.764125 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.764903 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.764953 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-config\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.765030 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.765108 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-config\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.765137 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-config-data\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.765179 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-logs\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.765227 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-scripts\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.765273 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln24f\" (UniqueName: \"kubernetes.io/projected/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-kube-api-access-ln24f\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " 
pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.765345 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-config-data-custom\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.765844 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.794504 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.835893 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mllrk\" (UniqueName: \"kubernetes.io/projected/8993dbbc-e0a2-46c8-b3e0-787dce0f121c-kube-api-access-mllrk\") pod \"dnsmasq-dns-76b5fdb995-wn8qh\" (UID: \"8993dbbc-e0a2-46c8-b3e0-787dce0f121c\") " pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.867730 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.867840 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-config-data\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.867901 4964 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-logs\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.867989 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-scripts\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.868071 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln24f\" (UniqueName: \"kubernetes.io/projected/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-kube-api-access-ln24f\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.868153 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-config-data-custom\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.868319 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-etc-machine-id\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.868689 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-etc-machine-id\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 
crc kubenswrapper[4964]: I1004 03:31:51.869392 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-logs\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.873801 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-config-data\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.875233 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-scripts\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.875427 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.884383 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-config-data-custom\") pod \"manila-api-0\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:51 crc kubenswrapper[4964]: I1004 03:31:51.887596 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln24f\" (UniqueName: \"kubernetes.io/projected/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-kube-api-access-ln24f\") pod \"manila-api-0\" (UID: 
\"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " pod="openstack/manila-api-0" Oct 04 03:31:52 crc kubenswrapper[4964]: I1004 03:31:52.062190 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 04 03:31:52 crc kubenswrapper[4964]: I1004 03:31:52.109246 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:52 crc kubenswrapper[4964]: I1004 03:31:52.402892 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 04 03:31:52 crc kubenswrapper[4964]: W1004 03:31:52.406561 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd656502_4058_4689_9474_b5b16bed8695.slice/crio-c2d78ddde7c972d35c22ae43098996135a8aafa6cd325b7175776d2b861cf127 WatchSource:0}: Error finding container c2d78ddde7c972d35c22ae43098996135a8aafa6cd325b7175776d2b861cf127: Status 404 returned error can't find the container with id c2d78ddde7c972d35c22ae43098996135a8aafa6cd325b7175776d2b861cf127 Oct 04 03:31:52 crc kubenswrapper[4964]: I1004 03:31:52.414883 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 04 03:31:52 crc kubenswrapper[4964]: I1004 03:31:52.695800 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 04 03:31:52 crc kubenswrapper[4964]: I1004 03:31:52.703047 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-wn8qh"] Oct 04 03:31:53 crc kubenswrapper[4964]: I1004 03:31:53.034469 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4","Type":"ContainerStarted","Data":"1005c32b1d60e58922a23a170a7555226fc80a72dbfc2fa0f307d770d6dcf67b"} Oct 04 03:31:53 crc kubenswrapper[4964]: I1004 03:31:53.044601 4964 generic.go:334] "Generic (PLEG): container 
finished" podID="8993dbbc-e0a2-46c8-b3e0-787dce0f121c" containerID="5a5afc887a28b289e9d959c02e238ec55b788bc3b1227dbf4db2ab96c45af8d0" exitCode=0 Oct 04 03:31:53 crc kubenswrapper[4964]: I1004 03:31:53.044749 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" event={"ID":"8993dbbc-e0a2-46c8-b3e0-787dce0f121c","Type":"ContainerDied","Data":"5a5afc887a28b289e9d959c02e238ec55b788bc3b1227dbf4db2ab96c45af8d0"} Oct 04 03:31:53 crc kubenswrapper[4964]: I1004 03:31:53.044777 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" event={"ID":"8993dbbc-e0a2-46c8-b3e0-787dce0f121c","Type":"ContainerStarted","Data":"fd33f3eb7b4b7ae77e4c1421fda96e75cdec5b636e0b04de8c14f35997e896e8"} Oct 04 03:31:53 crc kubenswrapper[4964]: I1004 03:31:53.050110 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"bd656502-4058-4689-9474-b5b16bed8695","Type":"ContainerStarted","Data":"c2d78ddde7c972d35c22ae43098996135a8aafa6cd325b7175776d2b861cf127"} Oct 04 03:31:53 crc kubenswrapper[4964]: I1004 03:31:53.052256 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83","Type":"ContainerStarted","Data":"10381b37714f91b2b65e77d67706337ad6ed92442c519ee3dc040024d8436dae"} Oct 04 03:31:54 crc kubenswrapper[4964]: I1004 03:31:54.079095 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4","Type":"ContainerStarted","Data":"0ce8dbbb5dd7e3cca0a3e8c8932e3c5bad8003665614294f6aa1043c4fbd05e7"} Oct 04 03:31:54 crc kubenswrapper[4964]: I1004 03:31:54.079664 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4","Type":"ContainerStarted","Data":"e1ced1ecd5addab616a9d225147c8e026b5abeffa1dd03e7d25e65bfcd7e7086"} Oct 04 03:31:54 
crc kubenswrapper[4964]: I1004 03:31:54.079682 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 04 03:31:54 crc kubenswrapper[4964]: I1004 03:31:54.090855 4964 generic.go:334] "Generic (PLEG): container finished" podID="66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" containerID="ac2c21b96f0b054e79a42217e3dd37ac63c998e39af0c30302c7e6484c3558a7" exitCode=0 Oct 04 03:31:54 crc kubenswrapper[4964]: I1004 03:31:54.090949 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67d99bc788-zm78q" event={"ID":"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc","Type":"ContainerDied","Data":"ac2c21b96f0b054e79a42217e3dd37ac63c998e39af0c30302c7e6484c3558a7"} Oct 04 03:31:54 crc kubenswrapper[4964]: I1004 03:31:54.094509 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" event={"ID":"8993dbbc-e0a2-46c8-b3e0-787dce0f121c","Type":"ContainerStarted","Data":"ebe04b6012e1cb0d34dd37233abd38813e528b8e16c99d06951997b6ae0fb3f6"} Oct 04 03:31:54 crc kubenswrapper[4964]: I1004 03:31:54.095480 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:31:54 crc kubenswrapper[4964]: I1004 03:31:54.101511 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.101498507 podStartE2EDuration="3.101498507s" podCreationTimestamp="2025-10-04 03:31:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 03:31:54.101118737 +0000 UTC m=+3093.998077375" watchObservedRunningTime="2025-10-04 03:31:54.101498507 +0000 UTC m=+3093.998457145" Oct 04 03:31:54 crc kubenswrapper[4964]: I1004 03:31:54.101946 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"bd656502-4058-4689-9474-b5b16bed8695","Type":"ContainerStarted","Data":"2b099a92892afa177c2079c823a382b79dcf4ebdbc54e6d24efcbd04dfa2d37e"} Oct 04 03:31:54 crc kubenswrapper[4964]: I1004 03:31:54.138059 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" podStartSLOduration=3.138038018 podStartE2EDuration="3.138038018s" podCreationTimestamp="2025-10-04 03:31:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 03:31:54.125993908 +0000 UTC m=+3094.022952546" watchObservedRunningTime="2025-10-04 03:31:54.138038018 +0000 UTC m=+3094.034996656" Oct 04 03:31:54 crc kubenswrapper[4964]: I1004 03:31:54.149398 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 04 03:31:55 crc kubenswrapper[4964]: I1004 03:31:55.111932 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"bd656502-4058-4689-9474-b5b16bed8695","Type":"ContainerStarted","Data":"b077f95192c1b79d26dc090f5b63f6c6ec57a08ca5f0751727ddcae7eb659473"} Oct 04 03:31:55 crc kubenswrapper[4964]: I1004 03:31:55.129956 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.222318059 podStartE2EDuration="4.129941863s" podCreationTimestamp="2025-10-04 03:31:51 +0000 UTC" firstStartedPulling="2025-10-04 03:31:52.408524411 +0000 UTC m=+3092.305483049" lastFinishedPulling="2025-10-04 03:31:53.316148215 +0000 UTC m=+3093.213106853" observedRunningTime="2025-10-04 03:31:55.128532076 +0000 UTC m=+3095.025490704" watchObservedRunningTime="2025-10-04 03:31:55.129941863 +0000 UTC m=+3095.026900501" Oct 04 03:31:56 crc kubenswrapper[4964]: I1004 03:31:56.121943 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" 
podUID="f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" containerName="manila-api-log" containerID="cri-o://e1ced1ecd5addab616a9d225147c8e026b5abeffa1dd03e7d25e65bfcd7e7086" gracePeriod=30 Oct 04 03:31:56 crc kubenswrapper[4964]: I1004 03:31:56.122132 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" containerName="manila-api" containerID="cri-o://0ce8dbbb5dd7e3cca0a3e8c8932e3c5bad8003665614294f6aa1043c4fbd05e7" gracePeriod=30 Oct 04 03:31:56 crc kubenswrapper[4964]: I1004 03:31:56.162976 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67d99bc788-zm78q" podUID="66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.245:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.245:8443: connect: connection refused" Oct 04 03:31:56 crc kubenswrapper[4964]: I1004 03:31:56.250132 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 03:31:56 crc kubenswrapper[4964]: I1004 03:31:56.250647 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69b09710-8018-47e1-9d2e-36df63451268" containerName="ceilometer-central-agent" containerID="cri-o://b7423db947cba0f404cde65df2af840f7f0bee247ed7824d2aaa9a946367c368" gracePeriod=30 Oct 04 03:31:56 crc kubenswrapper[4964]: I1004 03:31:56.250761 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69b09710-8018-47e1-9d2e-36df63451268" containerName="proxy-httpd" containerID="cri-o://577d8dac90a628c448abf855afcc52a97c16028ad831736c1f330c9ba30b49df" gracePeriod=30 Oct 04 03:31:56 crc kubenswrapper[4964]: I1004 03:31:56.250844 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69b09710-8018-47e1-9d2e-36df63451268" 
containerName="ceilometer-notification-agent" containerID="cri-o://ebd282f42eb883e68535e463ecfb15bc70d23b212b1d750db8bb89675e00b115" gracePeriod=30 Oct 04 03:31:56 crc kubenswrapper[4964]: I1004 03:31:56.250902 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69b09710-8018-47e1-9d2e-36df63451268" containerName="sg-core" containerID="cri-o://beda92b6737d95754b37686a36cd2441d380f20fa24da514e8742cc4389d9723" gracePeriod=30 Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.139996 4964 generic.go:334] "Generic (PLEG): container finished" podID="69b09710-8018-47e1-9d2e-36df63451268" containerID="577d8dac90a628c448abf855afcc52a97c16028ad831736c1f330c9ba30b49df" exitCode=0 Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.140290 4964 generic.go:334] "Generic (PLEG): container finished" podID="69b09710-8018-47e1-9d2e-36df63451268" containerID="beda92b6737d95754b37686a36cd2441d380f20fa24da514e8742cc4389d9723" exitCode=2 Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.140300 4964 generic.go:334] "Generic (PLEG): container finished" podID="69b09710-8018-47e1-9d2e-36df63451268" containerID="b7423db947cba0f404cde65df2af840f7f0bee247ed7824d2aaa9a946367c368" exitCode=0 Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.140317 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69b09710-8018-47e1-9d2e-36df63451268","Type":"ContainerDied","Data":"577d8dac90a628c448abf855afcc52a97c16028ad831736c1f330c9ba30b49df"} Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.140345 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69b09710-8018-47e1-9d2e-36df63451268","Type":"ContainerDied","Data":"beda92b6737d95754b37686a36cd2441d380f20fa24da514e8742cc4389d9723"} Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.140356 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"69b09710-8018-47e1-9d2e-36df63451268","Type":"ContainerDied","Data":"b7423db947cba0f404cde65df2af840f7f0bee247ed7824d2aaa9a946367c368"} Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.151278 4964 generic.go:334] "Generic (PLEG): container finished" podID="5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" containerID="f9dba9d2f9e7012d4bbce2855cf56cc639bc92f129ebbfe7db2e8843f3dfc6f1" exitCode=137 Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.151308 4964 generic.go:334] "Generic (PLEG): container finished" podID="5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" containerID="d40fffc9c4239f14b7da2efa8b90575b3f120bb991d1de67ba17d642fc751516" exitCode=137 Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.151346 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6554d5fc67-d5vgm" event={"ID":"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d","Type":"ContainerDied","Data":"f9dba9d2f9e7012d4bbce2855cf56cc639bc92f129ebbfe7db2e8843f3dfc6f1"} Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.151370 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6554d5fc67-d5vgm" event={"ID":"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d","Type":"ContainerDied","Data":"d40fffc9c4239f14b7da2efa8b90575b3f120bb991d1de67ba17d642fc751516"} Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.153131 4964 generic.go:334] "Generic (PLEG): container finished" podID="5445c577-835c-432b-88da-5fe7a9107cac" containerID="8783667709de39284bf8a0978d3e5305ddb1bbe37e2e1b4d4326bad59c79ca5b" exitCode=137 Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.153152 4964 generic.go:334] "Generic (PLEG): container finished" podID="5445c577-835c-432b-88da-5fe7a9107cac" containerID="3d537acce4c0d406a39324f85e4cce0afb66f09ae91bbb7394f6c50a50f4c3bc" exitCode=137 Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.153179 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66574674fc-j28wh" 
event={"ID":"5445c577-835c-432b-88da-5fe7a9107cac","Type":"ContainerDied","Data":"8783667709de39284bf8a0978d3e5305ddb1bbe37e2e1b4d4326bad59c79ca5b"} Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.153195 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66574674fc-j28wh" event={"ID":"5445c577-835c-432b-88da-5fe7a9107cac","Type":"ContainerDied","Data":"3d537acce4c0d406a39324f85e4cce0afb66f09ae91bbb7394f6c50a50f4c3bc"} Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.178417 4964 generic.go:334] "Generic (PLEG): container finished" podID="f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" containerID="0ce8dbbb5dd7e3cca0a3e8c8932e3c5bad8003665614294f6aa1043c4fbd05e7" exitCode=0 Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.178444 4964 generic.go:334] "Generic (PLEG): container finished" podID="f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" containerID="e1ced1ecd5addab616a9d225147c8e026b5abeffa1dd03e7d25e65bfcd7e7086" exitCode=143 Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.178484 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4","Type":"ContainerDied","Data":"0ce8dbbb5dd7e3cca0a3e8c8932e3c5bad8003665614294f6aa1043c4fbd05e7"} Oct 04 03:31:57 crc kubenswrapper[4964]: I1004 03:31:57.178511 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4","Type":"ContainerDied","Data":"e1ced1ecd5addab616a9d225147c8e026b5abeffa1dd03e7d25e65bfcd7e7086"} Oct 04 03:31:58 crc kubenswrapper[4964]: I1004 03:31:58.847901 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:31:58 crc kubenswrapper[4964]: E1004 03:31:58.848550 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.202503 4964 generic.go:334] "Generic (PLEG): container finished" podID="69b09710-8018-47e1-9d2e-36df63451268" containerID="ebd282f42eb883e68535e463ecfb15bc70d23b212b1d750db8bb89675e00b115" exitCode=0 Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.202560 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69b09710-8018-47e1-9d2e-36df63451268","Type":"ContainerDied","Data":"ebd282f42eb883e68535e463ecfb15bc70d23b212b1d750db8bb89675e00b115"} Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.898935 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.899893 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.900850 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989348 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-config-data\") pod \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989395 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5445c577-835c-432b-88da-5fe7a9107cac-horizon-secret-key\") pod \"5445c577-835c-432b-88da-5fe7a9107cac\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989429 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln24f\" (UniqueName: \"kubernetes.io/projected/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-kube-api-access-ln24f\") pod \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989470 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-config-data\") pod \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989550 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-logs\") pod \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989582 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-scripts\") pod \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989601 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-combined-ca-bundle\") pod \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989634 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-horizon-secret-key\") pod \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989705 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5445c577-835c-432b-88da-5fe7a9107cac-logs\") pod \"5445c577-835c-432b-88da-5fe7a9107cac\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989731 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5445c577-835c-432b-88da-5fe7a9107cac-scripts\") pod \"5445c577-835c-432b-88da-5fe7a9107cac\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989750 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-config-data-custom\") pod \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989817 4964 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcmbz\" (UniqueName: \"kubernetes.io/projected/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-kube-api-access-fcmbz\") pod \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989836 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-logs\") pod \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\" (UID: \"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989852 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l846f\" (UniqueName: \"kubernetes.io/projected/5445c577-835c-432b-88da-5fe7a9107cac-kube-api-access-l846f\") pod \"5445c577-835c-432b-88da-5fe7a9107cac\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989878 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-etc-machine-id\") pod \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\" (UID: \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989912 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5445c577-835c-432b-88da-5fe7a9107cac-config-data\") pod \"5445c577-835c-432b-88da-5fe7a9107cac\" (UID: \"5445c577-835c-432b-88da-5fe7a9107cac\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.989926 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-scripts\") pod \"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\" (UID: 
\"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4\") " Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.997473 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-scripts" (OuterVolumeSpecName: "scripts") pod "f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" (UID: "f9dcd1a9-b69f-41cf-916c-8d9b687d2af4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:31:59 crc kubenswrapper[4964]: I1004 03:31:59.997843 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5445c577-835c-432b-88da-5fe7a9107cac-logs" (OuterVolumeSpecName: "logs") pod "5445c577-835c-432b-88da-5fe7a9107cac" (UID: "5445c577-835c-432b-88da-5fe7a9107cac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.001119 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" (UID: "5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.002647 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5445c577-835c-432b-88da-5fe7a9107cac-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5445c577-835c-432b-88da-5fe7a9107cac" (UID: "5445c577-835c-432b-88da-5fe7a9107cac"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.002927 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-logs" (OuterVolumeSpecName: "logs") pod "5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" (UID: "5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.003714 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" (UID: "f9dcd1a9-b69f-41cf-916c-8d9b687d2af4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.006318 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-logs" (OuterVolumeSpecName: "logs") pod "f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" (UID: "f9dcd1a9-b69f-41cf-916c-8d9b687d2af4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.009753 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5445c577-835c-432b-88da-5fe7a9107cac-kube-api-access-l846f" (OuterVolumeSpecName: "kube-api-access-l846f") pod "5445c577-835c-432b-88da-5fe7a9107cac" (UID: "5445c577-835c-432b-88da-5fe7a9107cac"). InnerVolumeSpecName "kube-api-access-l846f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.009850 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" (UID: "f9dcd1a9-b69f-41cf-916c-8d9b687d2af4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.011823 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-kube-api-access-ln24f" (OuterVolumeSpecName: "kube-api-access-ln24f") pod "f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" (UID: "f9dcd1a9-b69f-41cf-916c-8d9b687d2af4"). InnerVolumeSpecName "kube-api-access-ln24f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.021463 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.031381 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-kube-api-access-fcmbz" (OuterVolumeSpecName: "kube-api-access-fcmbz") pod "5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" (UID: "5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d"). InnerVolumeSpecName "kube-api-access-fcmbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.034820 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5445c577-835c-432b-88da-5fe7a9107cac-config-data" (OuterVolumeSpecName: "config-data") pod "5445c577-835c-432b-88da-5fe7a9107cac" (UID: "5445c577-835c-432b-88da-5fe7a9107cac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.038152 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5445c577-835c-432b-88da-5fe7a9107cac-scripts" (OuterVolumeSpecName: "scripts") pod "5445c577-835c-432b-88da-5fe7a9107cac" (UID: "5445c577-835c-432b-88da-5fe7a9107cac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.046592 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-config-data" (OuterVolumeSpecName: "config-data") pod "5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" (UID: "5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.050527 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" (UID: "f9dcd1a9-b69f-41cf-916c-8d9b687d2af4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.063476 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-scripts" (OuterVolumeSpecName: "scripts") pod "5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" (UID: "5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096079 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096116 4964 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-logs\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096126 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096135 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096145 4964 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096157 4964 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5445c577-835c-432b-88da-5fe7a9107cac-logs\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096167 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5445c577-835c-432b-88da-5fe7a9107cac-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096180 4964 reconciler_common.go:293] "Volume detached for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096189 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcmbz\" (UniqueName: \"kubernetes.io/projected/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-kube-api-access-fcmbz\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096202 4964 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d-logs\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096210 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l846f\" (UniqueName: \"kubernetes.io/projected/5445c577-835c-432b-88da-5fe7a9107cac-kube-api-access-l846f\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096219 4964 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096229 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5445c577-835c-432b-88da-5fe7a9107cac-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096241 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096249 4964 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5445c577-835c-432b-88da-5fe7a9107cac-horizon-secret-key\") on node \"crc\" 
DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.096257 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln24f\" (UniqueName: \"kubernetes.io/projected/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-kube-api-access-ln24f\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.122379 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-config-data" (OuterVolumeSpecName: "config-data") pod "f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" (UID: "f9dcd1a9-b69f-41cf-916c-8d9b687d2af4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.197610 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69b09710-8018-47e1-9d2e-36df63451268-log-httpd\") pod \"69b09710-8018-47e1-9d2e-36df63451268\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.197683 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf98s\" (UniqueName: \"kubernetes.io/projected/69b09710-8018-47e1-9d2e-36df63451268-kube-api-access-pf98s\") pod \"69b09710-8018-47e1-9d2e-36df63451268\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.197759 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-combined-ca-bundle\") pod \"69b09710-8018-47e1-9d2e-36df63451268\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.197907 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-ceilometer-tls-certs\") pod \"69b09710-8018-47e1-9d2e-36df63451268\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.197929 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-sg-core-conf-yaml\") pod \"69b09710-8018-47e1-9d2e-36df63451268\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.197948 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69b09710-8018-47e1-9d2e-36df63451268-run-httpd\") pod \"69b09710-8018-47e1-9d2e-36df63451268\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.197969 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-scripts\") pod \"69b09710-8018-47e1-9d2e-36df63451268\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.197995 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-config-data\") pod \"69b09710-8018-47e1-9d2e-36df63451268\" (UID: \"69b09710-8018-47e1-9d2e-36df63451268\") " Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.198349 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.199972 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/69b09710-8018-47e1-9d2e-36df63451268-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "69b09710-8018-47e1-9d2e-36df63451268" (UID: "69b09710-8018-47e1-9d2e-36df63451268"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.200293 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69b09710-8018-47e1-9d2e-36df63451268-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "69b09710-8018-47e1-9d2e-36df63451268" (UID: "69b09710-8018-47e1-9d2e-36df63451268"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.204181 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69b09710-8018-47e1-9d2e-36df63451268-kube-api-access-pf98s" (OuterVolumeSpecName: "kube-api-access-pf98s") pod "69b09710-8018-47e1-9d2e-36df63451268" (UID: "69b09710-8018-47e1-9d2e-36df63451268"). InnerVolumeSpecName "kube-api-access-pf98s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.205747 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-scripts" (OuterVolumeSpecName: "scripts") pod "69b09710-8018-47e1-9d2e-36df63451268" (UID: "69b09710-8018-47e1-9d2e-36df63451268"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.220342 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6554d5fc67-d5vgm" event={"ID":"5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d","Type":"ContainerDied","Data":"1ab3f4210a94abe3baa23dd8411c0c61b5dfd3319dad37c40905fd95336b3956"} Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.220396 4964 scope.go:117] "RemoveContainer" containerID="f9dba9d2f9e7012d4bbce2855cf56cc639bc92f129ebbfe7db2e8843f3dfc6f1" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.220395 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6554d5fc67-d5vgm" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.223508 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66574674fc-j28wh" event={"ID":"5445c577-835c-432b-88da-5fe7a9107cac","Type":"ContainerDied","Data":"0aa030399cef43c850f53d761fe47758ecf88c1a4258fd81a4d164d9e8150443"} Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.223609 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66574674fc-j28wh" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.226516 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f9dcd1a9-b69f-41cf-916c-8d9b687d2af4","Type":"ContainerDied","Data":"1005c32b1d60e58922a23a170a7555226fc80a72dbfc2fa0f307d770d6dcf67b"} Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.226545 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.230464 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69b09710-8018-47e1-9d2e-36df63451268","Type":"ContainerDied","Data":"b971ffdef91e04f926fbbb7504b40e58d0a2c82ebfac494573bcf6ede57022b0"} Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.230575 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.250544 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "69b09710-8018-47e1-9d2e-36df63451268" (UID: "69b09710-8018-47e1-9d2e-36df63451268"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.283637 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6554d5fc67-d5vgm"] Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.284191 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "69b09710-8018-47e1-9d2e-36df63451268" (UID: "69b09710-8018-47e1-9d2e-36df63451268"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.305640 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6554d5fc67-d5vgm"] Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.308957 4964 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69b09710-8018-47e1-9d2e-36df63451268-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.309174 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf98s\" (UniqueName: \"kubernetes.io/projected/69b09710-8018-47e1-9d2e-36df63451268-kube-api-access-pf98s\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.309257 4964 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.309350 4964 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.309423 4964 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69b09710-8018-47e1-9d2e-36df63451268-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.309514 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.330348 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69b09710-8018-47e1-9d2e-36df63451268" (UID: "69b09710-8018-47e1-9d2e-36df63451268"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.336185 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66574674fc-j28wh"] Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.354799 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-config-data" (OuterVolumeSpecName: "config-data") pod "69b09710-8018-47e1-9d2e-36df63451268" (UID: "69b09710-8018-47e1-9d2e-36df63451268"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.360408 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-66574674fc-j28wh"] Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.372888 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.392114 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.399739 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 04 03:32:00 crc kubenswrapper[4964]: E1004 03:32:00.400139 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" containerName="manila-api-log" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400157 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" containerName="manila-api-log" Oct 04 03:32:00 crc kubenswrapper[4964]: E1004 03:32:00.400170 4964 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" containerName="horizon" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400178 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" containerName="horizon" Oct 04 03:32:00 crc kubenswrapper[4964]: E1004 03:32:00.400195 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69b09710-8018-47e1-9d2e-36df63451268" containerName="ceilometer-central-agent" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400202 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="69b09710-8018-47e1-9d2e-36df63451268" containerName="ceilometer-central-agent" Oct 04 03:32:00 crc kubenswrapper[4964]: E1004 03:32:00.400217 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69b09710-8018-47e1-9d2e-36df63451268" containerName="ceilometer-notification-agent" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400222 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="69b09710-8018-47e1-9d2e-36df63451268" containerName="ceilometer-notification-agent" Oct 04 03:32:00 crc kubenswrapper[4964]: E1004 03:32:00.400233 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5445c577-835c-432b-88da-5fe7a9107cac" containerName="horizon-log" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400238 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="5445c577-835c-432b-88da-5fe7a9107cac" containerName="horizon-log" Oct 04 03:32:00 crc kubenswrapper[4964]: E1004 03:32:00.400250 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" containerName="manila-api" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400256 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" containerName="manila-api" Oct 04 03:32:00 crc kubenswrapper[4964]: E1004 03:32:00.400275 4964 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" containerName="horizon-log" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400281 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" containerName="horizon-log" Oct 04 03:32:00 crc kubenswrapper[4964]: E1004 03:32:00.400293 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5445c577-835c-432b-88da-5fe7a9107cac" containerName="horizon" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400300 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="5445c577-835c-432b-88da-5fe7a9107cac" containerName="horizon" Oct 04 03:32:00 crc kubenswrapper[4964]: E1004 03:32:00.400312 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69b09710-8018-47e1-9d2e-36df63451268" containerName="proxy-httpd" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400319 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="69b09710-8018-47e1-9d2e-36df63451268" containerName="proxy-httpd" Oct 04 03:32:00 crc kubenswrapper[4964]: E1004 03:32:00.400329 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69b09710-8018-47e1-9d2e-36df63451268" containerName="sg-core" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400335 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="69b09710-8018-47e1-9d2e-36df63451268" containerName="sg-core" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400497 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="69b09710-8018-47e1-9d2e-36df63451268" containerName="ceilometer-central-agent" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400510 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="5445c577-835c-432b-88da-5fe7a9107cac" containerName="horizon-log" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400521 4964 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="69b09710-8018-47e1-9d2e-36df63451268" containerName="ceilometer-notification-agent" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400529 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" containerName="horizon-log" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400541 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="69b09710-8018-47e1-9d2e-36df63451268" containerName="proxy-httpd" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400551 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" containerName="horizon" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400565 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="5445c577-835c-432b-88da-5fe7a9107cac" containerName="horizon" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400573 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" containerName="manila-api-log" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400579 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="69b09710-8018-47e1-9d2e-36df63451268" containerName="sg-core" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.400586 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" containerName="manila-api" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.401609 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.405561 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.405725 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.405856 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.407676 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.412862 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.412897 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b09710-8018-47e1-9d2e-36df63451268-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.416070 4964 scope.go:117] "RemoveContainer" containerID="d40fffc9c4239f14b7da2efa8b90575b3f120bb991d1de67ba17d642fc751516" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.442322 4964 scope.go:117] "RemoveContainer" containerID="8783667709de39284bf8a0978d3e5305ddb1bbe37e2e1b4d4326bad59c79ca5b" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.514835 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbtvn\" (UniqueName: \"kubernetes.io/projected/e937ceaf-8613-4a23-a565-adafc14c8172-kube-api-access-bbtvn\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 
03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.514876 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e937ceaf-8613-4a23-a565-adafc14c8172-logs\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.515048 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-public-tls-certs\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.515220 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-internal-tls-certs\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.515442 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-scripts\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.515516 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.515705 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-config-data\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.515791 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-config-data-custom\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.515918 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e937ceaf-8613-4a23-a565-adafc14c8172-etc-machine-id\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.569020 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.579885 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.589068 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.591898 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.594440 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.594680 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.599468 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.603401 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.627442 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e937ceaf-8613-4a23-a565-adafc14c8172-etc-machine-id\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.627524 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbtvn\" (UniqueName: \"kubernetes.io/projected/e937ceaf-8613-4a23-a565-adafc14c8172-kube-api-access-bbtvn\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.627547 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e937ceaf-8613-4a23-a565-adafc14c8172-logs\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.627576 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-public-tls-certs\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.627627 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-internal-tls-certs\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.627675 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-scripts\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.627694 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.627711 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-config-data\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.627733 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-config-data-custom\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: 
I1004 03:32:00.640411 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e937ceaf-8613-4a23-a565-adafc14c8172-logs\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.642817 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-config-data-custom\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.642912 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-public-tls-certs\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.644818 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.644872 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e937ceaf-8613-4a23-a565-adafc14c8172-etc-machine-id\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.647808 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-scripts\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " 
pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.651123 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-config-data\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.656564 4964 scope.go:117] "RemoveContainer" containerID="3d537acce4c0d406a39324f85e4cce0afb66f09ae91bbb7394f6c50a50f4c3bc" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.693698 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e937ceaf-8613-4a23-a565-adafc14c8172-internal-tls-certs\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.696672 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbtvn\" (UniqueName: \"kubernetes.io/projected/e937ceaf-8613-4a23-a565-adafc14c8172-kube-api-access-bbtvn\") pod \"manila-api-0\" (UID: \"e937ceaf-8613-4a23-a565-adafc14c8172\") " pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.728937 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-config-data\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.728986 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8p8n\" (UniqueName: \"kubernetes.io/projected/6ffefef7-5afc-46df-a54e-0d06e34c499c-kube-api-access-c8p8n\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " 
pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.729026 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-scripts\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.729050 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffefef7-5afc-46df-a54e-0d06e34c499c-run-httpd\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.729101 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.729197 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.729269 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffefef7-5afc-46df-a54e-0d06e34c499c-log-httpd\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.729367 4964 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.732135 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.830899 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-scripts\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.831258 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffefef7-5afc-46df-a54e-0d06e34c499c-run-httpd\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.831336 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.831469 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.831580 4964 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffefef7-5afc-46df-a54e-0d06e34c499c-log-httpd\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.831650 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.831695 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-config-data\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.831753 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8p8n\" (UniqueName: \"kubernetes.io/projected/6ffefef7-5afc-46df-a54e-0d06e34c499c-kube-api-access-c8p8n\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.833441 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffefef7-5afc-46df-a54e-0d06e34c499c-log-httpd\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.837200 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffefef7-5afc-46df-a54e-0d06e34c499c-run-httpd\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 
crc kubenswrapper[4964]: I1004 03:32:00.837529 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.837750 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.840229 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-scripts\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.842428 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-config-data\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.842940 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.855357 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8p8n\" (UniqueName: \"kubernetes.io/projected/6ffefef7-5afc-46df-a54e-0d06e34c499c-kube-api-access-c8p8n\") pod 
\"ceilometer-0\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " pod="openstack/ceilometer-0" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.860573 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5445c577-835c-432b-88da-5fe7a9107cac" path="/var/lib/kubelet/pods/5445c577-835c-432b-88da-5fe7a9107cac/volumes" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.861601 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d" path="/var/lib/kubelet/pods/5ba15c1a-7eaa-4e8b-b068-3f4ee640d55d/volumes" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.865760 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69b09710-8018-47e1-9d2e-36df63451268" path="/var/lib/kubelet/pods/69b09710-8018-47e1-9d2e-36df63451268/volumes" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.867032 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9dcd1a9-b69f-41cf-916c-8d9b687d2af4" path="/var/lib/kubelet/pods/f9dcd1a9-b69f-41cf-916c-8d9b687d2af4/volumes" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.923678 4964 scope.go:117] "RemoveContainer" containerID="0ce8dbbb5dd7e3cca0a3e8c8932e3c5bad8003665614294f6aa1043c4fbd05e7" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.960922 4964 scope.go:117] "RemoveContainer" containerID="e1ced1ecd5addab616a9d225147c8e026b5abeffa1dd03e7d25e65bfcd7e7086" Oct 04 03:32:00 crc kubenswrapper[4964]: I1004 03:32:00.993300 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 03:32:01 crc kubenswrapper[4964]: I1004 03:32:01.004753 4964 scope.go:117] "RemoveContainer" containerID="577d8dac90a628c448abf855afcc52a97c16028ad831736c1f330c9ba30b49df" Oct 04 03:32:01 crc kubenswrapper[4964]: I1004 03:32:01.055727 4964 scope.go:117] "RemoveContainer" containerID="beda92b6737d95754b37686a36cd2441d380f20fa24da514e8742cc4389d9723" Oct 04 03:32:01 crc kubenswrapper[4964]: I1004 03:32:01.095371 4964 scope.go:117] "RemoveContainer" containerID="ebd282f42eb883e68535e463ecfb15bc70d23b212b1d750db8bb89675e00b115" Oct 04 03:32:01 crc kubenswrapper[4964]: I1004 03:32:01.124023 4964 scope.go:117] "RemoveContainer" containerID="b7423db947cba0f404cde65df2af840f7f0bee247ed7824d2aaa9a946367c368" Oct 04 03:32:01 crc kubenswrapper[4964]: I1004 03:32:01.256729 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83","Type":"ContainerStarted","Data":"3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba"} Oct 04 03:32:01 crc kubenswrapper[4964]: I1004 03:32:01.308431 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 04 03:32:01 crc kubenswrapper[4964]: I1004 03:32:01.519391 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 03:32:01 crc kubenswrapper[4964]: W1004 03:32:01.525983 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ffefef7_5afc_46df_a54e_0d06e34c499c.slice/crio-83fb5c0544ef07d9db6b5aa6e5afbe8f806c890e08ddaffb50df0c5f17defe85 WatchSource:0}: Error finding container 83fb5c0544ef07d9db6b5aa6e5afbe8f806c890e08ddaffb50df0c5f17defe85: Status 404 returned error can't find the container with id 83fb5c0544ef07d9db6b5aa6e5afbe8f806c890e08ddaffb50df0c5f17defe85 Oct 04 03:32:01 crc kubenswrapper[4964]: I1004 03:32:01.795347 4964 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 04 03:32:02 crc kubenswrapper[4964]: I1004 03:32:02.111643 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76b5fdb995-wn8qh" Oct 04 03:32:02 crc kubenswrapper[4964]: I1004 03:32:02.166805 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-f97dv"] Oct 04 03:32:02 crc kubenswrapper[4964]: I1004 03:32:02.167033 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" podUID="2d8985fb-442a-4daa-bf40-9079a2e62aba" containerName="dnsmasq-dns" containerID="cri-o://aa4d9b8c9647c5be10e60a465a36cd7576e892c9ad32c63be67173655fda3138" gracePeriod=10 Oct 04 03:32:02 crc kubenswrapper[4964]: I1004 03:32:02.340911 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e937ceaf-8613-4a23-a565-adafc14c8172","Type":"ContainerStarted","Data":"6fe32a7b76dd978963ff456881dc143e7b2a8c7c3ed97fcd716260c14d717ed8"} Oct 04 03:32:02 crc kubenswrapper[4964]: I1004 03:32:02.341084 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e937ceaf-8613-4a23-a565-adafc14c8172","Type":"ContainerStarted","Data":"f64c7cdfc7c0763cb5cba335ef0f31e3b06fe703a519d7a9dc6cdd5f32a19b4f"} Oct 04 03:32:02 crc kubenswrapper[4964]: I1004 03:32:02.359816 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ffefef7-5afc-46df-a54e-0d06e34c499c","Type":"ContainerStarted","Data":"83fb5c0544ef07d9db6b5aa6e5afbe8f806c890e08ddaffb50df0c5f17defe85"} Oct 04 03:32:02 crc kubenswrapper[4964]: I1004 03:32:02.372217 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83","Type":"ContainerStarted","Data":"2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9"} 
Oct 04 03:32:02 crc kubenswrapper[4964]: I1004 03:32:02.431137 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.079538572 podStartE2EDuration="11.43111797s" podCreationTimestamp="2025-10-04 03:31:51 +0000 UTC" firstStartedPulling="2025-10-04 03:31:52.413989756 +0000 UTC m=+3092.310948394" lastFinishedPulling="2025-10-04 03:31:59.765569154 +0000 UTC m=+3099.662527792" observedRunningTime="2025-10-04 03:32:02.423842758 +0000 UTC m=+3102.320801396" watchObservedRunningTime="2025-10-04 03:32:02.43111797 +0000 UTC m=+3102.328076608" Oct 04 03:32:02 crc kubenswrapper[4964]: I1004 03:32:02.871658 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.003668 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-config\") pod \"2d8985fb-442a-4daa-bf40-9079a2e62aba\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.003796 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-ovsdbserver-nb\") pod \"2d8985fb-442a-4daa-bf40-9079a2e62aba\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.003842 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-dns-svc\") pod \"2d8985fb-442a-4daa-bf40-9079a2e62aba\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.003889 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-ovsdbserver-sb\") pod \"2d8985fb-442a-4daa-bf40-9079a2e62aba\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.004070 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvnzx\" (UniqueName: \"kubernetes.io/projected/2d8985fb-442a-4daa-bf40-9079a2e62aba-kube-api-access-rvnzx\") pod \"2d8985fb-442a-4daa-bf40-9079a2e62aba\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.004122 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-openstack-edpm-ipam\") pod \"2d8985fb-442a-4daa-bf40-9079a2e62aba\" (UID: \"2d8985fb-442a-4daa-bf40-9079a2e62aba\") " Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.017828 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d8985fb-442a-4daa-bf40-9079a2e62aba-kube-api-access-rvnzx" (OuterVolumeSpecName: "kube-api-access-rvnzx") pod "2d8985fb-442a-4daa-bf40-9079a2e62aba" (UID: "2d8985fb-442a-4daa-bf40-9079a2e62aba"). InnerVolumeSpecName "kube-api-access-rvnzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.077590 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2d8985fb-442a-4daa-bf40-9079a2e62aba" (UID: "2d8985fb-442a-4daa-bf40-9079a2e62aba"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.084569 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2d8985fb-442a-4daa-bf40-9079a2e62aba" (UID: "2d8985fb-442a-4daa-bf40-9079a2e62aba"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.088137 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2d8985fb-442a-4daa-bf40-9079a2e62aba" (UID: "2d8985fb-442a-4daa-bf40-9079a2e62aba"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.101773 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-config" (OuterVolumeSpecName: "config") pod "2d8985fb-442a-4daa-bf40-9079a2e62aba" (UID: "2d8985fb-442a-4daa-bf40-9079a2e62aba"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.106784 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvnzx\" (UniqueName: \"kubernetes.io/projected/2d8985fb-442a-4daa-bf40-9079a2e62aba-kube-api-access-rvnzx\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.106818 4964 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-config\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.106828 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.106840 4964 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.106848 4964 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.151210 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "2d8985fb-442a-4daa-bf40-9079a2e62aba" (UID: "2d8985fb-442a-4daa-bf40-9079a2e62aba"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.209393 4964 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2d8985fb-442a-4daa-bf40-9079a2e62aba-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.384160 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e937ceaf-8613-4a23-a565-adafc14c8172","Type":"ContainerStarted","Data":"93d5c507dad9a47e94d218d6d20a99a9e6daf8266eb585aa784aa36410a74eb7"} Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.386824 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.389690 4964 generic.go:334] "Generic (PLEG): container finished" podID="2d8985fb-442a-4daa-bf40-9079a2e62aba" containerID="aa4d9b8c9647c5be10e60a465a36cd7576e892c9ad32c63be67173655fda3138" exitCode=0 Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.389747 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.389751 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" event={"ID":"2d8985fb-442a-4daa-bf40-9079a2e62aba","Type":"ContainerDied","Data":"aa4d9b8c9647c5be10e60a465a36cd7576e892c9ad32c63be67173655fda3138"} Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.389795 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-f97dv" event={"ID":"2d8985fb-442a-4daa-bf40-9079a2e62aba","Type":"ContainerDied","Data":"015542d47f2c70243c6113e88c5c32252ef4e64582a93add3479d68d345bca2b"} Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.389813 4964 scope.go:117] "RemoveContainer" containerID="aa4d9b8c9647c5be10e60a465a36cd7576e892c9ad32c63be67173655fda3138" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.392273 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ffefef7-5afc-46df-a54e-0d06e34c499c","Type":"ContainerStarted","Data":"1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404"} Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.404562 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.4045470939999998 podStartE2EDuration="3.404547094s" podCreationTimestamp="2025-10-04 03:32:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 03:32:03.404479873 +0000 UTC m=+3103.301438511" watchObservedRunningTime="2025-10-04 03:32:03.404547094 +0000 UTC m=+3103.301505732" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.416509 4964 scope.go:117] "RemoveContainer" containerID="98c5e85573c9bbd808758517c70c4022c10d8eb1035478b0e7f58b8bb3eba973" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.429754 4964 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-f97dv"] Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.436783 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-f97dv"] Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.438802 4964 scope.go:117] "RemoveContainer" containerID="aa4d9b8c9647c5be10e60a465a36cd7576e892c9ad32c63be67173655fda3138" Oct 04 03:32:03 crc kubenswrapper[4964]: E1004 03:32:03.439233 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa4d9b8c9647c5be10e60a465a36cd7576e892c9ad32c63be67173655fda3138\": container with ID starting with aa4d9b8c9647c5be10e60a465a36cd7576e892c9ad32c63be67173655fda3138 not found: ID does not exist" containerID="aa4d9b8c9647c5be10e60a465a36cd7576e892c9ad32c63be67173655fda3138" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.439264 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4d9b8c9647c5be10e60a465a36cd7576e892c9ad32c63be67173655fda3138"} err="failed to get container status \"aa4d9b8c9647c5be10e60a465a36cd7576e892c9ad32c63be67173655fda3138\": rpc error: code = NotFound desc = could not find container \"aa4d9b8c9647c5be10e60a465a36cd7576e892c9ad32c63be67173655fda3138\": container with ID starting with aa4d9b8c9647c5be10e60a465a36cd7576e892c9ad32c63be67173655fda3138 not found: ID does not exist" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.439283 4964 scope.go:117] "RemoveContainer" containerID="98c5e85573c9bbd808758517c70c4022c10d8eb1035478b0e7f58b8bb3eba973" Oct 04 03:32:03 crc kubenswrapper[4964]: E1004 03:32:03.439629 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98c5e85573c9bbd808758517c70c4022c10d8eb1035478b0e7f58b8bb3eba973\": container with ID starting with 
98c5e85573c9bbd808758517c70c4022c10d8eb1035478b0e7f58b8bb3eba973 not found: ID does not exist" containerID="98c5e85573c9bbd808758517c70c4022c10d8eb1035478b0e7f58b8bb3eba973" Oct 04 03:32:03 crc kubenswrapper[4964]: I1004 03:32:03.439652 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c5e85573c9bbd808758517c70c4022c10d8eb1035478b0e7f58b8bb3eba973"} err="failed to get container status \"98c5e85573c9bbd808758517c70c4022c10d8eb1035478b0e7f58b8bb3eba973\": rpc error: code = NotFound desc = could not find container \"98c5e85573c9bbd808758517c70c4022c10d8eb1035478b0e7f58b8bb3eba973\": container with ID starting with 98c5e85573c9bbd808758517c70c4022c10d8eb1035478b0e7f58b8bb3eba973 not found: ID does not exist" Oct 04 03:32:04 crc kubenswrapper[4964]: I1004 03:32:04.397884 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 03:32:04 crc kubenswrapper[4964]: I1004 03:32:04.405964 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ffefef7-5afc-46df-a54e-0d06e34c499c","Type":"ContainerStarted","Data":"56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1"} Oct 04 03:32:04 crc kubenswrapper[4964]: I1004 03:32:04.856709 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d8985fb-442a-4daa-bf40-9079a2e62aba" path="/var/lib/kubelet/pods/2d8985fb-442a-4daa-bf40-9079a2e62aba/volumes" Oct 04 03:32:05 crc kubenswrapper[4964]: I1004 03:32:05.418555 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ffefef7-5afc-46df-a54e-0d06e34c499c","Type":"ContainerStarted","Data":"36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17"} Oct 04 03:32:06 crc kubenswrapper[4964]: I1004 03:32:06.163106 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67d99bc788-zm78q" podUID="66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" containerName="horizon" 
probeResult="failure" output="Get \"https://10.217.0.245:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.245:8443: connect: connection refused" Oct 04 03:32:07 crc kubenswrapper[4964]: I1004 03:32:07.444735 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ffefef7-5afc-46df-a54e-0d06e34c499c","Type":"ContainerStarted","Data":"b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b"} Oct 04 03:32:07 crc kubenswrapper[4964]: I1004 03:32:07.445482 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="ceilometer-central-agent" containerID="cri-o://1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404" gracePeriod=30 Oct 04 03:32:07 crc kubenswrapper[4964]: I1004 03:32:07.445885 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 03:32:07 crc kubenswrapper[4964]: I1004 03:32:07.446264 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="proxy-httpd" containerID="cri-o://b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b" gracePeriod=30 Oct 04 03:32:07 crc kubenswrapper[4964]: I1004 03:32:07.446349 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="sg-core" containerID="cri-o://36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17" gracePeriod=30 Oct 04 03:32:07 crc kubenswrapper[4964]: I1004 03:32:07.446409 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="ceilometer-notification-agent" 
containerID="cri-o://56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1" gracePeriod=30 Oct 04 03:32:07 crc kubenswrapper[4964]: I1004 03:32:07.481978 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.568439722 podStartE2EDuration="7.481954962s" podCreationTimestamp="2025-10-04 03:32:00 +0000 UTC" firstStartedPulling="2025-10-04 03:32:01.52888428 +0000 UTC m=+3101.425842918" lastFinishedPulling="2025-10-04 03:32:06.44239952 +0000 UTC m=+3106.339358158" observedRunningTime="2025-10-04 03:32:07.476095516 +0000 UTC m=+3107.373054234" watchObservedRunningTime="2025-10-04 03:32:07.481954962 +0000 UTC m=+3107.378913610" Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.244985 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.424695 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-ceilometer-tls-certs\") pod \"6ffefef7-5afc-46df-a54e-0d06e34c499c\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.424826 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-config-data\") pod \"6ffefef7-5afc-46df-a54e-0d06e34c499c\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.424911 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-scripts\") pod \"6ffefef7-5afc-46df-a54e-0d06e34c499c\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.424962 4964 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8p8n\" (UniqueName: \"kubernetes.io/projected/6ffefef7-5afc-46df-a54e-0d06e34c499c-kube-api-access-c8p8n\") pod \"6ffefef7-5afc-46df-a54e-0d06e34c499c\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.425052 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffefef7-5afc-46df-a54e-0d06e34c499c-run-httpd\") pod \"6ffefef7-5afc-46df-a54e-0d06e34c499c\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.425149 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffefef7-5afc-46df-a54e-0d06e34c499c-log-httpd\") pod \"6ffefef7-5afc-46df-a54e-0d06e34c499c\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.425189 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-combined-ca-bundle\") pod \"6ffefef7-5afc-46df-a54e-0d06e34c499c\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.425221 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-sg-core-conf-yaml\") pod \"6ffefef7-5afc-46df-a54e-0d06e34c499c\" (UID: \"6ffefef7-5afc-46df-a54e-0d06e34c499c\") " Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.425594 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ffefef7-5afc-46df-a54e-0d06e34c499c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6ffefef7-5afc-46df-a54e-0d06e34c499c" 
(UID: "6ffefef7-5afc-46df-a54e-0d06e34c499c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.425769 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ffefef7-5afc-46df-a54e-0d06e34c499c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6ffefef7-5afc-46df-a54e-0d06e34c499c" (UID: "6ffefef7-5afc-46df-a54e-0d06e34c499c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.427704 4964 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffefef7-5afc-46df-a54e-0d06e34c499c-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.427758 4964 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffefef7-5afc-46df-a54e-0d06e34c499c-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.431282 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ffefef7-5afc-46df-a54e-0d06e34c499c-kube-api-access-c8p8n" (OuterVolumeSpecName: "kube-api-access-c8p8n") pod "6ffefef7-5afc-46df-a54e-0d06e34c499c" (UID: "6ffefef7-5afc-46df-a54e-0d06e34c499c"). InnerVolumeSpecName "kube-api-access-c8p8n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.432127 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-scripts" (OuterVolumeSpecName: "scripts") pod "6ffefef7-5afc-46df-a54e-0d06e34c499c" (UID: "6ffefef7-5afc-46df-a54e-0d06e34c499c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.461105 4964 generic.go:334] "Generic (PLEG): container finished" podID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerID="b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b" exitCode=0
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.461142 4964 generic.go:334] "Generic (PLEG): container finished" podID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerID="36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17" exitCode=2
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.461152 4964 generic.go:334] "Generic (PLEG): container finished" podID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerID="56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1" exitCode=0
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.461164 4964 generic.go:334] "Generic (PLEG): container finished" podID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerID="1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404" exitCode=0
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.461186 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ffefef7-5afc-46df-a54e-0d06e34c499c","Type":"ContainerDied","Data":"b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b"}
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.461223 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ffefef7-5afc-46df-a54e-0d06e34c499c","Type":"ContainerDied","Data":"36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17"}
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.461239 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ffefef7-5afc-46df-a54e-0d06e34c499c","Type":"ContainerDied","Data":"56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1"}
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.461251 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ffefef7-5afc-46df-a54e-0d06e34c499c","Type":"ContainerDied","Data":"1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404"}
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.461262 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ffefef7-5afc-46df-a54e-0d06e34c499c","Type":"ContainerDied","Data":"83fb5c0544ef07d9db6b5aa6e5afbe8f806c890e08ddaffb50df0c5f17defe85"}
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.461261 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.461257 4964 scope.go:117] "RemoveContainer" containerID="b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.463470 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6ffefef7-5afc-46df-a54e-0d06e34c499c" (UID: "6ffefef7-5afc-46df-a54e-0d06e34c499c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.486287 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6ffefef7-5afc-46df-a54e-0d06e34c499c" (UID: "6ffefef7-5afc-46df-a54e-0d06e34c499c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.493586 4964 scope.go:117] "RemoveContainer" containerID="36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.523320 4964 scope.go:117] "RemoveContainer" containerID="56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.530658 4964 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.530701 4964 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.530718 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-scripts\") on node \"crc\" DevicePath \"\""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.530734 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8p8n\" (UniqueName: \"kubernetes.io/projected/6ffefef7-5afc-46df-a54e-0d06e34c499c-kube-api-access-c8p8n\") on node \"crc\" DevicePath \"\""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.551978 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ffefef7-5afc-46df-a54e-0d06e34c499c" (UID: "6ffefef7-5afc-46df-a54e-0d06e34c499c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.555391 4964 scope.go:117] "RemoveContainer" containerID="1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.594632 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-config-data" (OuterVolumeSpecName: "config-data") pod "6ffefef7-5afc-46df-a54e-0d06e34c499c" (UID: "6ffefef7-5afc-46df-a54e-0d06e34c499c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.632284 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.632454 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ffefef7-5afc-46df-a54e-0d06e34c499c-config-data\") on node \"crc\" DevicePath \"\""
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.708170 4964 scope.go:117] "RemoveContainer" containerID="b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b"
Oct 04 03:32:08 crc kubenswrapper[4964]: E1004 03:32:08.708934 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b\": container with ID starting with b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b not found: ID does not exist" containerID="b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.709180 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b"} err="failed to get container status \"b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b\": rpc error: code = NotFound desc = could not find container \"b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b\": container with ID starting with b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.709300 4964 scope.go:117] "RemoveContainer" containerID="36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17"
Oct 04 03:32:08 crc kubenswrapper[4964]: E1004 03:32:08.709863 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17\": container with ID starting with 36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17 not found: ID does not exist" containerID="36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.709999 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17"} err="failed to get container status \"36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17\": rpc error: code = NotFound desc = could not find container \"36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17\": container with ID starting with 36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17 not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.710090 4964 scope.go:117] "RemoveContainer" containerID="56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1"
Oct 04 03:32:08 crc kubenswrapper[4964]: E1004 03:32:08.710807 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1\": container with ID starting with 56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1 not found: ID does not exist" containerID="56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.710863 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1"} err="failed to get container status \"56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1\": rpc error: code = NotFound desc = could not find container \"56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1\": container with ID starting with 56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1 not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.710899 4964 scope.go:117] "RemoveContainer" containerID="1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404"
Oct 04 03:32:08 crc kubenswrapper[4964]: E1004 03:32:08.711422 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404\": container with ID starting with 1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404 not found: ID does not exist" containerID="1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.711457 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404"} err="failed to get container status \"1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404\": rpc error: code = NotFound desc = could not find container \"1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404\": container with ID starting with 1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404 not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.711477 4964 scope.go:117] "RemoveContainer" containerID="b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.711809 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b"} err="failed to get container status \"b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b\": rpc error: code = NotFound desc = could not find container \"b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b\": container with ID starting with b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.711990 4964 scope.go:117] "RemoveContainer" containerID="36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.712656 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17"} err="failed to get container status \"36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17\": rpc error: code = NotFound desc = could not find container \"36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17\": container with ID starting with 36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17 not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.712679 4964 scope.go:117] "RemoveContainer" containerID="56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.712943 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1"} err="failed to get container status \"56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1\": rpc error: code = NotFound desc = could not find container \"56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1\": container with ID starting with 56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1 not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.713029 4964 scope.go:117] "RemoveContainer" containerID="1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.713462 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404"} err="failed to get container status \"1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404\": rpc error: code = NotFound desc = could not find container \"1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404\": container with ID starting with 1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404 not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.713597 4964 scope.go:117] "RemoveContainer" containerID="b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.713997 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b"} err="failed to get container status \"b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b\": rpc error: code = NotFound desc = could not find container \"b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b\": container with ID starting with b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.714028 4964 scope.go:117] "RemoveContainer" containerID="36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.714300 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17"} err="failed to get container status \"36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17\": rpc error: code = NotFound desc = could not find container \"36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17\": container with ID starting with 36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17 not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.714403 4964 scope.go:117] "RemoveContainer" containerID="56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.714758 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1"} err="failed to get container status \"56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1\": rpc error: code = NotFound desc = could not find container \"56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1\": container with ID starting with 56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1 not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.714782 4964 scope.go:117] "RemoveContainer" containerID="1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.714980 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404"} err="failed to get container status \"1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404\": rpc error: code = NotFound desc = could not find container \"1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404\": container with ID starting with 1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404 not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.714998 4964 scope.go:117] "RemoveContainer" containerID="b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.715414 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b"} err="failed to get container status \"b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b\": rpc error: code = NotFound desc = could not find container \"b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b\": container with ID starting with b6091edf5c78027cb93adebcc38f19008bc286fc3bb2841777a757060ed9176b not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.715493 4964 scope.go:117] "RemoveContainer" containerID="36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.715806 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17"} err="failed to get container status \"36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17\": rpc error: code = NotFound desc = could not find container \"36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17\": container with ID starting with 36c57cc9997557b34a82341c0921006bacd418fa76fd1c71542dffd88cd9db17 not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.715826 4964 scope.go:117] "RemoveContainer" containerID="56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.716068 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1"} err="failed to get container status \"56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1\": rpc error: code = NotFound desc = could not find container \"56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1\": container with ID starting with 56cb913a89ad11c64d6ce82d0cda0d0489fe8b57039724c3603d36a0fc72e7d1 not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.716112 4964 scope.go:117] "RemoveContainer" containerID="1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.716360 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404"} err="failed to get container status \"1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404\": rpc error: code = NotFound desc = could not find container \"1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404\": container with ID starting with 1a15b8bc58a363523bd18cfdde0c5a7ddc9c67d125e94f2a3d393a0034ea2404 not found: ID does not exist"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.801992 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.827647 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.838163 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 04 03:32:08 crc kubenswrapper[4964]: E1004 03:32:08.838667 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="ceilometer-central-agent"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.838689 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="ceilometer-central-agent"
Oct 04 03:32:08 crc kubenswrapper[4964]: E1004 03:32:08.838717 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8985fb-442a-4daa-bf40-9079a2e62aba" containerName="init"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.838727 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8985fb-442a-4daa-bf40-9079a2e62aba" containerName="init"
Oct 04 03:32:08 crc kubenswrapper[4964]: E1004 03:32:08.838751 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="ceilometer-notification-agent"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.838760 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="ceilometer-notification-agent"
Oct 04 03:32:08 crc kubenswrapper[4964]: E1004 03:32:08.838777 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="proxy-httpd"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.838785 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="proxy-httpd"
Oct 04 03:32:08 crc kubenswrapper[4964]: E1004 03:32:08.838800 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="sg-core"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.838809 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="sg-core"
Oct 04 03:32:08 crc kubenswrapper[4964]: E1004 03:32:08.838832 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8985fb-442a-4daa-bf40-9079a2e62aba" containerName="dnsmasq-dns"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.838840 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8985fb-442a-4daa-bf40-9079a2e62aba" containerName="dnsmasq-dns"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.839056 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="proxy-httpd"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.839082 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="ceilometer-notification-agent"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.839103 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d8985fb-442a-4daa-bf40-9079a2e62aba" containerName="dnsmasq-dns"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.839119 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="sg-core"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.839132 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" containerName="ceilometer-central-agent"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.841251 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.845164 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.862384 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.862687 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.862846 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 04 03:32:08 crc kubenswrapper[4964]: I1004 03:32:08.879011 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ffefef7-5afc-46df-a54e-0d06e34c499c" path="/var/lib/kubelet/pods/6ffefef7-5afc-46df-a54e-0d06e34c499c/volumes"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.039133 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dm8q\" (UniqueName: \"kubernetes.io/projected/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-kube-api-access-7dm8q\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.039445 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-config-data\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.039516 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.039577 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-run-httpd\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.039637 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-log-httpd\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.039677 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-scripts\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.039743 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.039783 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.141459 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.141583 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.141707 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dm8q\" (UniqueName: \"kubernetes.io/projected/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-kube-api-access-7dm8q\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.141746 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-config-data\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.141815 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.141858 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-run-httpd\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.141897 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-log-httpd\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.141955 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-scripts\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.143176 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-log-httpd\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.143916 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-run-httpd\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.148509 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.156004 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.157287 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-scripts\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.164498 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-config-data\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.165208 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.181735 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dm8q\" (UniqueName: \"kubernetes.io/projected/f1f15c57-eddd-4228-b863-b9e9cd1e3c71-kube-api-access-7dm8q\") pod \"ceilometer-0\" (UID: \"f1f15c57-eddd-4228-b863-b9e9cd1e3c71\") " pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.186127 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 04 03:32:09 crc kubenswrapper[4964]: I1004 03:32:09.663237 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 04 03:32:10 crc kubenswrapper[4964]: I1004 03:32:10.491301 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1f15c57-eddd-4228-b863-b9e9cd1e3c71","Type":"ContainerStarted","Data":"f9092a8eaaec7848810acb3237194833767fb0779436934e97029aa92001d84a"}
Oct 04 03:32:11 crc kubenswrapper[4964]: I1004 03:32:11.506037 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1f15c57-eddd-4228-b863-b9e9cd1e3c71","Type":"ContainerStarted","Data":"1f35b2587e166a4b7d44a605b41f466bc0c0ea06d5774af6ebdb1d9040c5fee9"}
Oct 04 03:32:11 crc kubenswrapper[4964]: I1004 03:32:11.767067 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Oct 04 03:32:12 crc kubenswrapper[4964]: I1004 03:32:12.533495 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1f15c57-eddd-4228-b863-b9e9cd1e3c71","Type":"ContainerStarted","Data":"94c91dee5b8eeb6f71d8e67e570f9629a905140adb3ff2ab91594123fce38875"}
Oct 04 03:32:13 crc kubenswrapper[4964]: I1004 03:32:13.211471 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Oct 04 03:32:13 crc kubenswrapper[4964]: I1004 03:32:13.314131 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"]
Oct 04 03:32:13 crc kubenswrapper[4964]: I1004 03:32:13.374165 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Oct 04 03:32:13 crc kubenswrapper[4964]: I1004 03:32:13.442277 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"]
Oct 04 03:32:13 crc kubenswrapper[4964]: I1004 03:32:13.546456 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" containerName="manila-share" containerID="cri-o://3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba" gracePeriod=30
Oct 04 03:32:13 crc kubenswrapper[4964]: I1004 03:32:13.546780 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1f15c57-eddd-4228-b863-b9e9cd1e3c71","Type":"ContainerStarted","Data":"ba90f9f981c8f0615af8849509b861e8d55a18cf0d458f9a1da9faf1ef1c5bde"}
Oct 04 03:32:13 crc kubenswrapper[4964]: I1004 03:32:13.546873 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" containerName="probe" containerID="cri-o://2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9" gracePeriod=30
Oct 04 03:32:13 crc kubenswrapper[4964]: I1004 03:32:13.547009 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="bd656502-4058-4689-9474-b5b16bed8695" containerName="manila-scheduler" containerID="cri-o://2b099a92892afa177c2079c823a382b79dcf4ebdbc54e6d24efcbd04dfa2d37e" gracePeriod=30
Oct 04 03:32:13 crc kubenswrapper[4964]: I1004 03:32:13.546999 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="bd656502-4058-4689-9474-b5b16bed8695" containerName="probe" containerID="cri-o://b077f95192c1b79d26dc090f5b63f6c6ec57a08ca5f0751727ddcae7eb659473" gracePeriod=30
Oct 04 03:32:13 crc kubenswrapper[4964]: I1004 03:32:13.845886 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d"
Oct 04 03:32:13 crc kubenswrapper[4964]: E1004 03:32:13.846527 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.405654 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.502082 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-combined-ca-bundle\") pod \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.502404 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-etc-machine-id\") pod \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.502429 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-config-data-custom\") pod \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.502467 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" (UID: "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.502517 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9wl9\" (UniqueName: \"kubernetes.io/projected/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-kube-api-access-f9wl9\") pod \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.502568 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-var-lib-manila\") pod \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.502593 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-ceph\") pod \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.502743 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-config-data\") pod \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.502764 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" (UID: "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83"). InnerVolumeSpecName "var-lib-manila". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.502771 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-scripts\") pod \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\" (UID: \"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83\") " Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.503283 4964 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-var-lib-manila\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.503309 4964 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.507728 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-ceph" (OuterVolumeSpecName: "ceph") pod "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" (UID: "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.507842 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" (UID: "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.507897 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-scripts" (OuterVolumeSpecName: "scripts") pod "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" (UID: "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.508091 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-kube-api-access-f9wl9" (OuterVolumeSpecName: "kube-api-access-f9wl9") pod "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" (UID: "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83"). InnerVolumeSpecName "kube-api-access-f9wl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.559116 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" (UID: "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.564408 4964 generic.go:334] "Generic (PLEG): container finished" podID="bd656502-4058-4689-9474-b5b16bed8695" containerID="b077f95192c1b79d26dc090f5b63f6c6ec57a08ca5f0751727ddcae7eb659473" exitCode=0 Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.564488 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"bd656502-4058-4689-9474-b5b16bed8695","Type":"ContainerDied","Data":"b077f95192c1b79d26dc090f5b63f6c6ec57a08ca5f0751727ddcae7eb659473"} Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.568941 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1f15c57-eddd-4228-b863-b9e9cd1e3c71","Type":"ContainerStarted","Data":"271b410bfd5cc1c2ada6f42d267140b437911fc663812f7bce35700c77701a10"} Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.569119 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.573885 4964 generic.go:334] "Generic (PLEG): container finished" podID="b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" containerID="2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9" exitCode=0 Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.573921 4964 generic.go:334] "Generic (PLEG): container finished" podID="b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" containerID="3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba" exitCode=1 Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.573951 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83","Type":"ContainerDied","Data":"2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9"} Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.573987 4964 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/manila-share-share1-0" event={"ID":"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83","Type":"ContainerDied","Data":"3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba"} Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.574002 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83","Type":"ContainerDied","Data":"10381b37714f91b2b65e77d67706337ad6ed92442c519ee3dc040024d8436dae"} Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.574005 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.574032 4964 scope.go:117] "RemoveContainer" containerID="2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.603939 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.536509723 podStartE2EDuration="6.603921465s" podCreationTimestamp="2025-10-04 03:32:08 +0000 UTC" firstStartedPulling="2025-10-04 03:32:09.67254473 +0000 UTC m=+3109.569503368" lastFinishedPulling="2025-10-04 03:32:13.739956432 +0000 UTC m=+3113.636915110" observedRunningTime="2025-10-04 03:32:14.593237021 +0000 UTC m=+3114.490195659" watchObservedRunningTime="2025-10-04 03:32:14.603921465 +0000 UTC m=+3114.500880103" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.605104 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9wl9\" (UniqueName: \"kubernetes.io/projected/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-kube-api-access-f9wl9\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.605136 4964 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-ceph\") on node \"crc\" DevicePath 
\"\"" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.605149 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.605159 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.605169 4964 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.606472 4964 scope.go:117] "RemoveContainer" containerID="3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.626035 4964 scope.go:117] "RemoveContainer" containerID="2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9" Oct 04 03:32:14 crc kubenswrapper[4964]: E1004 03:32:14.626592 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9\": container with ID starting with 2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9 not found: ID does not exist" containerID="2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.626644 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9"} err="failed to get container status \"2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9\": rpc error: code = NotFound desc = 
could not find container \"2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9\": container with ID starting with 2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9 not found: ID does not exist" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.626669 4964 scope.go:117] "RemoveContainer" containerID="3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba" Oct 04 03:32:14 crc kubenswrapper[4964]: E1004 03:32:14.626915 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba\": container with ID starting with 3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba not found: ID does not exist" containerID="3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.626953 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba"} err="failed to get container status \"3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba\": rpc error: code = NotFound desc = could not find container \"3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba\": container with ID starting with 3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba not found: ID does not exist" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.626987 4964 scope.go:117] "RemoveContainer" containerID="2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.627268 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9"} err="failed to get container status \"2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9\": rpc error: code = 
NotFound desc = could not find container \"2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9\": container with ID starting with 2b91735ff52dfe35c0e1d0e8f82c55303d7196f5a15ad7d94917644417086fe9 not found: ID does not exist" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.627288 4964 scope.go:117] "RemoveContainer" containerID="3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.627470 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba"} err="failed to get container status \"3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba\": rpc error: code = NotFound desc = could not find container \"3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba\": container with ID starting with 3be733b3555d06a65d9874303015e563bdb08c7007f623826a89309ef1be62ba not found: ID does not exist" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.644796 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-config-data" (OuterVolumeSpecName: "config-data") pod "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" (UID: "b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.707207 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.901286 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.913594 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.929177 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 04 03:32:14 crc kubenswrapper[4964]: E1004 03:32:14.929755 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" containerName="manila-share" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.929779 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" containerName="manila-share" Oct 04 03:32:14 crc kubenswrapper[4964]: E1004 03:32:14.929835 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" containerName="probe" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.929848 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" containerName="probe" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.930130 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" containerName="manila-share" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.930166 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" containerName="probe" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 
03:32:14.933571 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.936033 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 04 03:32:14 crc kubenswrapper[4964]: I1004 03:32:14.937365 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.013672 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8a194e94-6624-49cc-ba2a-19860c8c95bf-ceph\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.013737 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8a194e94-6624-49cc-ba2a-19860c8c95bf-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.013779 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a194e94-6624-49cc-ba2a-19860c8c95bf-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.013804 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lcvl\" (UniqueName: \"kubernetes.io/projected/8a194e94-6624-49cc-ba2a-19860c8c95bf-kube-api-access-5lcvl\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " 
pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.013857 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a194e94-6624-49cc-ba2a-19860c8c95bf-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.013965 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a194e94-6624-49cc-ba2a-19860c8c95bf-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.013996 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a194e94-6624-49cc-ba2a-19860c8c95bf-config-data\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.014023 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a194e94-6624-49cc-ba2a-19860c8c95bf-scripts\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.116582 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8a194e94-6624-49cc-ba2a-19860c8c95bf-ceph\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 
03:32:15.117236 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8a194e94-6624-49cc-ba2a-19860c8c95bf-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.117285 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a194e94-6624-49cc-ba2a-19860c8c95bf-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.117310 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lcvl\" (UniqueName: \"kubernetes.io/projected/8a194e94-6624-49cc-ba2a-19860c8c95bf-kube-api-access-5lcvl\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.117332 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a194e94-6624-49cc-ba2a-19860c8c95bf-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.117425 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a194e94-6624-49cc-ba2a-19860c8c95bf-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.117425 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8a194e94-6624-49cc-ba2a-19860c8c95bf-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.117454 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a194e94-6624-49cc-ba2a-19860c8c95bf-config-data\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.117500 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a194e94-6624-49cc-ba2a-19860c8c95bf-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.117684 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a194e94-6624-49cc-ba2a-19860c8c95bf-scripts\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.122088 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a194e94-6624-49cc-ba2a-19860c8c95bf-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.125135 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a194e94-6624-49cc-ba2a-19860c8c95bf-scripts\") pod \"manila-share-share1-0\" (UID: 
\"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.125296 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a194e94-6624-49cc-ba2a-19860c8c95bf-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.135752 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lcvl\" (UniqueName: \"kubernetes.io/projected/8a194e94-6624-49cc-ba2a-19860c8c95bf-kube-api-access-5lcvl\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.138150 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8a194e94-6624-49cc-ba2a-19860c8c95bf-ceph\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.139072 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a194e94-6624-49cc-ba2a-19860c8c95bf-config-data\") pod \"manila-share-share1-0\" (UID: \"8a194e94-6624-49cc-ba2a-19860c8c95bf\") " pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.251155 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 04 03:32:15 crc kubenswrapper[4964]: W1004 03:32:15.784695 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a194e94_6624_49cc_ba2a_19860c8c95bf.slice/crio-ec4aa96c4afa0337c11bc562d043289d278c49df67874fc753d34de826d4f51d WatchSource:0}: Error finding container ec4aa96c4afa0337c11bc562d043289d278c49df67874fc753d34de826d4f51d: Status 404 returned error can't find the container with id ec4aa96c4afa0337c11bc562d043289d278c49df67874fc753d34de826d4f51d Oct 04 03:32:15 crc kubenswrapper[4964]: I1004 03:32:15.786881 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.162275 4964 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67d99bc788-zm78q" podUID="66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.245:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.245:8443: connect: connection refused" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.162646 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.490523 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.549851 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7497\" (UniqueName: \"kubernetes.io/projected/bd656502-4058-4689-9474-b5b16bed8695-kube-api-access-j7497\") pod \"bd656502-4058-4689-9474-b5b16bed8695\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.549945 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd656502-4058-4689-9474-b5b16bed8695-etc-machine-id\") pod \"bd656502-4058-4689-9474-b5b16bed8695\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.549977 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-scripts\") pod \"bd656502-4058-4689-9474-b5b16bed8695\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.550033 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-config-data\") pod \"bd656502-4058-4689-9474-b5b16bed8695\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.550168 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-config-data-custom\") pod \"bd656502-4058-4689-9474-b5b16bed8695\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.550232 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/bd656502-4058-4689-9474-b5b16bed8695-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bd656502-4058-4689-9474-b5b16bed8695" (UID: "bd656502-4058-4689-9474-b5b16bed8695"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.550313 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-combined-ca-bundle\") pod \"bd656502-4058-4689-9474-b5b16bed8695\" (UID: \"bd656502-4058-4689-9474-b5b16bed8695\") " Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.550865 4964 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd656502-4058-4689-9474-b5b16bed8695-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.554238 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-scripts" (OuterVolumeSpecName: "scripts") pod "bd656502-4058-4689-9474-b5b16bed8695" (UID: "bd656502-4058-4689-9474-b5b16bed8695"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.554272 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd656502-4058-4689-9474-b5b16bed8695-kube-api-access-j7497" (OuterVolumeSpecName: "kube-api-access-j7497") pod "bd656502-4058-4689-9474-b5b16bed8695" (UID: "bd656502-4058-4689-9474-b5b16bed8695"). InnerVolumeSpecName "kube-api-access-j7497". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.554817 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bd656502-4058-4689-9474-b5b16bed8695" (UID: "bd656502-4058-4689-9474-b5b16bed8695"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.593673 4964 generic.go:334] "Generic (PLEG): container finished" podID="bd656502-4058-4689-9474-b5b16bed8695" containerID="2b099a92892afa177c2079c823a382b79dcf4ebdbc54e6d24efcbd04dfa2d37e" exitCode=0 Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.593741 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.593755 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"bd656502-4058-4689-9474-b5b16bed8695","Type":"ContainerDied","Data":"2b099a92892afa177c2079c823a382b79dcf4ebdbc54e6d24efcbd04dfa2d37e"} Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.593785 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"bd656502-4058-4689-9474-b5b16bed8695","Type":"ContainerDied","Data":"c2d78ddde7c972d35c22ae43098996135a8aafa6cd325b7175776d2b861cf127"} Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.593805 4964 scope.go:117] "RemoveContainer" containerID="b077f95192c1b79d26dc090f5b63f6c6ec57a08ca5f0751727ddcae7eb659473" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.595953 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"8a194e94-6624-49cc-ba2a-19860c8c95bf","Type":"ContainerStarted","Data":"5855552de801382ab69c797b420e2daa5dc9006c053d806cf3c51f70c25bff14"} Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.595985 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8a194e94-6624-49cc-ba2a-19860c8c95bf","Type":"ContainerStarted","Data":"ec4aa96c4afa0337c11bc562d043289d278c49df67874fc753d34de826d4f51d"} Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.596135 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd656502-4058-4689-9474-b5b16bed8695" (UID: "bd656502-4058-4689-9474-b5b16bed8695"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.643791 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-config-data" (OuterVolumeSpecName: "config-data") pod "bd656502-4058-4689-9474-b5b16bed8695" (UID: "bd656502-4058-4689-9474-b5b16bed8695"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.644325 4964 scope.go:117] "RemoveContainer" containerID="2b099a92892afa177c2079c823a382b79dcf4ebdbc54e6d24efcbd04dfa2d37e" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.653327 4964 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.653350 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.653359 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7497\" (UniqueName: \"kubernetes.io/projected/bd656502-4058-4689-9474-b5b16bed8695-kube-api-access-j7497\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.653368 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.653377 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd656502-4058-4689-9474-b5b16bed8695-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.666574 4964 scope.go:117] "RemoveContainer" containerID="b077f95192c1b79d26dc090f5b63f6c6ec57a08ca5f0751727ddcae7eb659473" Oct 04 03:32:16 crc kubenswrapper[4964]: E1004 03:32:16.667003 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b077f95192c1b79d26dc090f5b63f6c6ec57a08ca5f0751727ddcae7eb659473\": container with ID starting with b077f95192c1b79d26dc090f5b63f6c6ec57a08ca5f0751727ddcae7eb659473 not found: ID does not exist" containerID="b077f95192c1b79d26dc090f5b63f6c6ec57a08ca5f0751727ddcae7eb659473" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.667044 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b077f95192c1b79d26dc090f5b63f6c6ec57a08ca5f0751727ddcae7eb659473"} err="failed to get container status \"b077f95192c1b79d26dc090f5b63f6c6ec57a08ca5f0751727ddcae7eb659473\": rpc error: code = NotFound desc = could not find container \"b077f95192c1b79d26dc090f5b63f6c6ec57a08ca5f0751727ddcae7eb659473\": container with ID starting with b077f95192c1b79d26dc090f5b63f6c6ec57a08ca5f0751727ddcae7eb659473 not found: ID does not exist" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.667069 4964 scope.go:117] "RemoveContainer" containerID="2b099a92892afa177c2079c823a382b79dcf4ebdbc54e6d24efcbd04dfa2d37e" Oct 04 03:32:16 crc kubenswrapper[4964]: E1004 03:32:16.667338 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b099a92892afa177c2079c823a382b79dcf4ebdbc54e6d24efcbd04dfa2d37e\": container with ID starting with 2b099a92892afa177c2079c823a382b79dcf4ebdbc54e6d24efcbd04dfa2d37e not found: ID does not exist" containerID="2b099a92892afa177c2079c823a382b79dcf4ebdbc54e6d24efcbd04dfa2d37e" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.667380 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b099a92892afa177c2079c823a382b79dcf4ebdbc54e6d24efcbd04dfa2d37e"} err="failed to get container status \"2b099a92892afa177c2079c823a382b79dcf4ebdbc54e6d24efcbd04dfa2d37e\": rpc error: code = NotFound desc = could not find container \"2b099a92892afa177c2079c823a382b79dcf4ebdbc54e6d24efcbd04dfa2d37e\": container with ID 
starting with 2b099a92892afa177c2079c823a382b79dcf4ebdbc54e6d24efcbd04dfa2d37e not found: ID does not exist" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.861017 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83" path="/var/lib/kubelet/pods/b75824f9-0bab-4c5d-87a5-3d9cd6fe6e83/volumes" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.924724 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.933727 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.948832 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 04 03:32:16 crc kubenswrapper[4964]: E1004 03:32:16.949229 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd656502-4058-4689-9474-b5b16bed8695" containerName="manila-scheduler" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.949250 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd656502-4058-4689-9474-b5b16bed8695" containerName="manila-scheduler" Oct 04 03:32:16 crc kubenswrapper[4964]: E1004 03:32:16.949271 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd656502-4058-4689-9474-b5b16bed8695" containerName="probe" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.949280 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd656502-4058-4689-9474-b5b16bed8695" containerName="probe" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.949839 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd656502-4058-4689-9474-b5b16bed8695" containerName="probe" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.949885 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd656502-4058-4689-9474-b5b16bed8695" containerName="manila-scheduler" Oct 04 
03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.951186 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.952844 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 04 03:32:16 crc kubenswrapper[4964]: I1004 03:32:16.959173 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.058587 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltp2w\" (UniqueName: \"kubernetes.io/projected/0e479211-b081-4b46-93b3-e1ae824dd73a-kube-api-access-ltp2w\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.058968 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e479211-b081-4b46-93b3-e1ae824dd73a-scripts\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.059059 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e479211-b081-4b46-93b3-e1ae824dd73a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.059080 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e479211-b081-4b46-93b3-e1ae824dd73a-config-data\") pod \"manila-scheduler-0\" (UID: 
\"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.059099 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e479211-b081-4b46-93b3-e1ae824dd73a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.059429 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e479211-b081-4b46-93b3-e1ae824dd73a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.161160 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltp2w\" (UniqueName: \"kubernetes.io/projected/0e479211-b081-4b46-93b3-e1ae824dd73a-kube-api-access-ltp2w\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.161244 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e479211-b081-4b46-93b3-e1ae824dd73a-scripts\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.161346 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e479211-b081-4b46-93b3-e1ae824dd73a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc 
kubenswrapper[4964]: I1004 03:32:17.161364 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e479211-b081-4b46-93b3-e1ae824dd73a-config-data\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.161383 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e479211-b081-4b46-93b3-e1ae824dd73a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.161422 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e479211-b081-4b46-93b3-e1ae824dd73a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.161522 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e479211-b081-4b46-93b3-e1ae824dd73a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.167983 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e479211-b081-4b46-93b3-e1ae824dd73a-config-data\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.168156 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0e479211-b081-4b46-93b3-e1ae824dd73a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.169065 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e479211-b081-4b46-93b3-e1ae824dd73a-scripts\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.170587 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e479211-b081-4b46-93b3-e1ae824dd73a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.180882 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltp2w\" (UniqueName: \"kubernetes.io/projected/0e479211-b081-4b46-93b3-e1ae824dd73a-kube-api-access-ltp2w\") pod \"manila-scheduler-0\" (UID: \"0e479211-b081-4b46-93b3-e1ae824dd73a\") " pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.264831 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.608974 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8a194e94-6624-49cc-ba2a-19860c8c95bf","Type":"ContainerStarted","Data":"7de7665874e887058091627c662377b0602f64e85c7dd79bbd12488ba295b452"} Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.650994 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.650969135 podStartE2EDuration="3.650969135s" podCreationTimestamp="2025-10-04 03:32:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 03:32:17.634139938 +0000 UTC m=+3117.531098606" watchObservedRunningTime="2025-10-04 03:32:17.650969135 +0000 UTC m=+3117.547927783" Oct 04 03:32:17 crc kubenswrapper[4964]: I1004 03:32:17.728876 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 04 03:32:17 crc kubenswrapper[4964]: W1004 03:32:17.752834 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e479211_b081_4b46_93b3_e1ae824dd73a.slice/crio-d9185a5579f947bcf898bbbb1ff8d40cf8c45ce88dc5bca39d89b4578ddfa508 WatchSource:0}: Error finding container d9185a5579f947bcf898bbbb1ff8d40cf8c45ce88dc5bca39d89b4578ddfa508: Status 404 returned error can't find the container with id d9185a5579f947bcf898bbbb1ff8d40cf8c45ce88dc5bca39d89b4578ddfa508 Oct 04 03:32:18 crc kubenswrapper[4964]: I1004 03:32:18.623566 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"0e479211-b081-4b46-93b3-e1ae824dd73a","Type":"ContainerStarted","Data":"7c576b11912e76cb37ed9bde9d46f53475d1c0b3605702b0cc8b01e0f6941b3a"} Oct 04 03:32:18 crc kubenswrapper[4964]: I1004 03:32:18.623899 4964 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"0e479211-b081-4b46-93b3-e1ae824dd73a","Type":"ContainerStarted","Data":"d9185a5579f947bcf898bbbb1ff8d40cf8c45ce88dc5bca39d89b4578ddfa508"} Oct 04 03:32:18 crc kubenswrapper[4964]: I1004 03:32:18.862790 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd656502-4058-4689-9474-b5b16bed8695" path="/var/lib/kubelet/pods/bd656502-4058-4689-9474-b5b16bed8695/volumes" Oct 04 03:32:19 crc kubenswrapper[4964]: I1004 03:32:19.640728 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"0e479211-b081-4b46-93b3-e1ae824dd73a","Type":"ContainerStarted","Data":"fccc8a08759f67a8aa649c5acb5a5f7a6f1770f93c996d6580fbd9350177a9c4"} Oct 04 03:32:19 crc kubenswrapper[4964]: I1004 03:32:19.677993 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.677972783 podStartE2EDuration="3.677972783s" podCreationTimestamp="2025-10-04 03:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 03:32:19.672365704 +0000 UTC m=+3119.569324382" watchObservedRunningTime="2025-10-04 03:32:19.677972783 +0000 UTC m=+3119.574931432" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.450723 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.547305 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-config-data\") pod \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.547604 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jffnt\" (UniqueName: \"kubernetes.io/projected/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-kube-api-access-jffnt\") pod \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.547755 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-horizon-secret-key\") pod \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.547840 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-combined-ca-bundle\") pod \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.547927 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-scripts\") pod \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.548008 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-logs\") pod \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.548095 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-horizon-tls-certs\") pod \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\" (UID: \"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc\") " Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.548562 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-logs" (OuterVolumeSpecName: "logs") pod "66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" (UID: "66a3cc2b-2e02-460d-877b-a4e0a3fd8abc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.554842 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" (UID: "66a3cc2b-2e02-460d-877b-a4e0a3fd8abc"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.554877 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-kube-api-access-jffnt" (OuterVolumeSpecName: "kube-api-access-jffnt") pod "66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" (UID: "66a3cc2b-2e02-460d-877b-a4e0a3fd8abc"). InnerVolumeSpecName "kube-api-access-jffnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.576929 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-scripts" (OuterVolumeSpecName: "scripts") pod "66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" (UID: "66a3cc2b-2e02-460d-877b-a4e0a3fd8abc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.581045 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-config-data" (OuterVolumeSpecName: "config-data") pod "66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" (UID: "66a3cc2b-2e02-460d-877b-a4e0a3fd8abc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.590757 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" (UID: "66a3cc2b-2e02-460d-877b-a4e0a3fd8abc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.603994 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" (UID: "66a3cc2b-2e02-460d-877b-a4e0a3fd8abc"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.649531 4964 generic.go:334] "Generic (PLEG): container finished" podID="66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" containerID="75ce3f7ac04385521960386f6a1a7a6c041161ff627b8d8c088953808e5f3262" exitCode=137 Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.649885 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67d99bc788-zm78q" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.650019 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67d99bc788-zm78q" event={"ID":"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc","Type":"ContainerDied","Data":"75ce3f7ac04385521960386f6a1a7a6c041161ff627b8d8c088953808e5f3262"} Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.650061 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67d99bc788-zm78q" event={"ID":"66a3cc2b-2e02-460d-877b-a4e0a3fd8abc","Type":"ContainerDied","Data":"c5f02cb8cd8dfe4eba9e8208ea2d34143ecd582315073ad90955b8be4817ffca"} Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.650080 4964 scope.go:117] "RemoveContainer" containerID="ac2c21b96f0b054e79a42217e3dd37ac63c998e39af0c30302c7e6484c3558a7" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.650396 4964 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.650628 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.650656 4964 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.650669 4964 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-logs\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.650681 4964 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.650692 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.650705 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jffnt\" (UniqueName: \"kubernetes.io/projected/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc-kube-api-access-jffnt\") on node \"crc\" DevicePath \"\"" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.680592 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67d99bc788-zm78q"] Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.687712 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67d99bc788-zm78q"] Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.826286 4964 scope.go:117] "RemoveContainer" containerID="75ce3f7ac04385521960386f6a1a7a6c041161ff627b8d8c088953808e5f3262" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.843920 4964 scope.go:117] "RemoveContainer" containerID="ac2c21b96f0b054e79a42217e3dd37ac63c998e39af0c30302c7e6484c3558a7" Oct 04 03:32:20 crc kubenswrapper[4964]: E1004 03:32:20.844431 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"ac2c21b96f0b054e79a42217e3dd37ac63c998e39af0c30302c7e6484c3558a7\": container with ID starting with ac2c21b96f0b054e79a42217e3dd37ac63c998e39af0c30302c7e6484c3558a7 not found: ID does not exist" containerID="ac2c21b96f0b054e79a42217e3dd37ac63c998e39af0c30302c7e6484c3558a7" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.844461 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac2c21b96f0b054e79a42217e3dd37ac63c998e39af0c30302c7e6484c3558a7"} err="failed to get container status \"ac2c21b96f0b054e79a42217e3dd37ac63c998e39af0c30302c7e6484c3558a7\": rpc error: code = NotFound desc = could not find container \"ac2c21b96f0b054e79a42217e3dd37ac63c998e39af0c30302c7e6484c3558a7\": container with ID starting with ac2c21b96f0b054e79a42217e3dd37ac63c998e39af0c30302c7e6484c3558a7 not found: ID does not exist" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.844482 4964 scope.go:117] "RemoveContainer" containerID="75ce3f7ac04385521960386f6a1a7a6c041161ff627b8d8c088953808e5f3262" Oct 04 03:32:20 crc kubenswrapper[4964]: E1004 03:32:20.844793 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75ce3f7ac04385521960386f6a1a7a6c041161ff627b8d8c088953808e5f3262\": container with ID starting with 75ce3f7ac04385521960386f6a1a7a6c041161ff627b8d8c088953808e5f3262 not found: ID does not exist" containerID="75ce3f7ac04385521960386f6a1a7a6c041161ff627b8d8c088953808e5f3262" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.844841 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ce3f7ac04385521960386f6a1a7a6c041161ff627b8d8c088953808e5f3262"} err="failed to get container status \"75ce3f7ac04385521960386f6a1a7a6c041161ff627b8d8c088953808e5f3262\": rpc error: code = NotFound desc = could not find container 
\"75ce3f7ac04385521960386f6a1a7a6c041161ff627b8d8c088953808e5f3262\": container with ID starting with 75ce3f7ac04385521960386f6a1a7a6c041161ff627b8d8c088953808e5f3262 not found: ID does not exist" Oct 04 03:32:20 crc kubenswrapper[4964]: I1004 03:32:20.856169 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" path="/var/lib/kubelet/pods/66a3cc2b-2e02-460d-877b-a4e0a3fd8abc/volumes" Oct 04 03:32:21 crc kubenswrapper[4964]: I1004 03:32:21.968973 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 04 03:32:25 crc kubenswrapper[4964]: I1004 03:32:25.252083 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 04 03:32:26 crc kubenswrapper[4964]: I1004 03:32:26.846609 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:32:26 crc kubenswrapper[4964]: E1004 03:32:26.847553 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:32:27 crc kubenswrapper[4964]: I1004 03:32:27.265116 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 04 03:32:36 crc kubenswrapper[4964]: I1004 03:32:36.620978 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 04 03:32:37 crc kubenswrapper[4964]: I1004 03:32:37.845854 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 
03:32:37 crc kubenswrapper[4964]: E1004 03:32:37.846542 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:32:38 crc kubenswrapper[4964]: I1004 03:32:38.646130 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 04 03:32:39 crc kubenswrapper[4964]: I1004 03:32:39.205074 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 04 03:32:52 crc kubenswrapper[4964]: I1004 03:32:52.846650 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:32:52 crc kubenswrapper[4964]: E1004 03:32:52.847400 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:33:06 crc kubenswrapper[4964]: I1004 03:33:06.846724 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:33:06 crc kubenswrapper[4964]: E1004 03:33:06.847665 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:33:21 crc kubenswrapper[4964]: I1004 03:33:21.846179 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:33:21 crc kubenswrapper[4964]: E1004 03:33:21.847525 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.701952 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 04 03:33:28 crc kubenswrapper[4964]: E1004 03:33:28.702817 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" containerName="horizon-log" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.702830 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" containerName="horizon-log" Oct 04 03:33:28 crc kubenswrapper[4964]: E1004 03:33:28.702848 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" containerName="horizon" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.702855 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" containerName="horizon" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.703034 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" 
containerName="horizon" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.703058 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="66a3cc2b-2e02-460d-877b-a4e0a3fd8abc" containerName="horizon-log" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.703657 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.707745 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-tbt6v" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.707762 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.708517 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.708611 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.731133 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.809607 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a803f4c5-eef4-426a-acea-039c19405797-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.809991 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a803f4c5-eef4-426a-acea-039c19405797-config-data\") pod \"tempest-tests-tempest\" 
(UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.810049 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a803f4c5-eef4-426a-acea-039c19405797-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.810168 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.810217 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-599ws\" (UniqueName: \"kubernetes.io/projected/a803f4c5-eef4-426a-acea-039c19405797-kube-api-access-599ws\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.810252 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.810313 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a803f4c5-eef4-426a-acea-039c19405797-test-operator-ephemeral-temporary\") pod 
\"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.810423 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.810517 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.912581 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.913021 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a803f4c5-eef4-426a-acea-039c19405797-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.913172 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a803f4c5-eef4-426a-acea-039c19405797-config-data\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " 
pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.913208 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a803f4c5-eef4-426a-acea-039c19405797-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.913276 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.913312 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-599ws\" (UniqueName: \"kubernetes.io/projected/a803f4c5-eef4-426a-acea-039c19405797-kube-api-access-599ws\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.913352 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.913400 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a803f4c5-eef4-426a-acea-039c19405797-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc 
kubenswrapper[4964]: I1004 03:33:28.913471 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.913839 4964 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.914011 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a803f4c5-eef4-426a-acea-039c19405797-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.914399 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a803f4c5-eef4-426a-acea-039c19405797-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.914551 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a803f4c5-eef4-426a-acea-039c19405797-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.914668 4964 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a803f4c5-eef4-426a-acea-039c19405797-config-data\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.922082 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.923105 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.923761 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.939109 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-599ws\" (UniqueName: \"kubernetes.io/projected/a803f4c5-eef4-426a-acea-039c19405797-kube-api-access-599ws\") pod \"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:28 crc kubenswrapper[4964]: I1004 03:33:28.968869 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"tempest-tests-tempest\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " pod="openstack/tempest-tests-tempest" Oct 04 03:33:29 crc kubenswrapper[4964]: I1004 03:33:29.022355 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 04 03:33:29 crc kubenswrapper[4964]: I1004 03:33:29.539276 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 04 03:33:30 crc kubenswrapper[4964]: I1004 03:33:30.444670 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a803f4c5-eef4-426a-acea-039c19405797","Type":"ContainerStarted","Data":"62cc3f35109656f97b1fab54c1108e7ebd7b8e2720c2102e6c5d9f5c6acf8b1b"} Oct 04 03:33:32 crc kubenswrapper[4964]: I1004 03:33:32.848915 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:33:32 crc kubenswrapper[4964]: E1004 03:33:32.849116 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:33:47 crc kubenswrapper[4964]: I1004 03:33:47.845414 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:33:58 crc kubenswrapper[4964]: E1004 03:33:58.327425 4964 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 04 03:33:58 crc kubenswrapper[4964]: E1004 03:33:58.328156 4964 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-599ws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,R
eadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(a803f4c5-eef4-426a-acea-039c19405797): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 03:33:58 crc kubenswrapper[4964]: E1004 03:33:58.329431 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="a803f4c5-eef4-426a-acea-039c19405797" Oct 04 03:33:58 crc kubenswrapper[4964]: I1004 03:33:58.780393 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"412be4ed2625de91b6f7a42ca3ca7bf1d47f40824ac7a8628ae9031e46bbd039"} Oct 04 03:33:58 crc kubenswrapper[4964]: E1004 03:33:58.781550 4964 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="a803f4c5-eef4-426a-acea-039c19405797" Oct 04 03:34:13 crc kubenswrapper[4964]: I1004 03:34:13.980110 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a803f4c5-eef4-426a-acea-039c19405797","Type":"ContainerStarted","Data":"e782f5ae2bd13361426f77ba3c07203a64c14fe2285c79ede6317d265ecb4239"} Oct 04 03:36:04 crc kubenswrapper[4964]: I1004 03:36:04.449309 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:36:04 crc kubenswrapper[4964]: I1004 03:36:04.449926 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:36:34 crc kubenswrapper[4964]: I1004 03:36:34.448837 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:36:34 crc kubenswrapper[4964]: I1004 03:36:34.449557 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:36:38 crc kubenswrapper[4964]: I1004 03:36:38.572913 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="c8162366-d682-4f52-8402-4eff0411aae0" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.162:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 04 03:36:39 crc kubenswrapper[4964]: I1004 03:36:39.036858 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-volume-volume1-0" podUID="1eb33c04-b905-4472-839d-89537682be92" containerName="cinder-volume" probeResult="failure" output="Get \"http://10.217.0.237:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 04 03:36:39 crc kubenswrapper[4964]: I1004 03:36:39.094738 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-backup-0" podUID="68b726b8-57ae-48e1-ba37-9e0be7cc3f79" containerName="cinder-backup" probeResult="failure" output="Get \"http://10.217.0.238:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 04 03:36:44 crc kubenswrapper[4964]: I1004 03:36:44.371641 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=154.635539489 podStartE2EDuration="3m17.371590087s" podCreationTimestamp="2025-10-04 03:33:27 +0000 UTC" firstStartedPulling="2025-10-04 03:33:29.548460516 +0000 UTC m=+3189.445419154" lastFinishedPulling="2025-10-04 03:34:12.284511114 +0000 UTC m=+3232.181469752" observedRunningTime="2025-10-04 03:34:14.009836121 +0000 UTC m=+3233.906794779" watchObservedRunningTime="2025-10-04 03:36:44.371590087 +0000 UTC m=+3384.268548735" Oct 04 03:36:44 crc kubenswrapper[4964]: I1004 03:36:44.377975 4964 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k9q9t"] Oct 04 03:36:44 crc kubenswrapper[4964]: I1004 03:36:44.380295 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:44 crc kubenswrapper[4964]: I1004 03:36:44.404936 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9q9t"] Oct 04 03:36:44 crc kubenswrapper[4964]: I1004 03:36:44.426426 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-catalog-content\") pod \"redhat-marketplace-k9q9t\" (UID: \"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d\") " pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:44 crc kubenswrapper[4964]: I1004 03:36:44.426957 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-utilities\") pod \"redhat-marketplace-k9q9t\" (UID: \"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d\") " pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:44 crc kubenswrapper[4964]: I1004 03:36:44.427046 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7tzr\" (UniqueName: \"kubernetes.io/projected/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-kube-api-access-z7tzr\") pod \"redhat-marketplace-k9q9t\" (UID: \"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d\") " pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:44 crc kubenswrapper[4964]: I1004 03:36:44.528701 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-catalog-content\") pod \"redhat-marketplace-k9q9t\" (UID: 
\"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d\") " pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:44 crc kubenswrapper[4964]: I1004 03:36:44.528759 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-utilities\") pod \"redhat-marketplace-k9q9t\" (UID: \"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d\") " pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:44 crc kubenswrapper[4964]: I1004 03:36:44.528789 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7tzr\" (UniqueName: \"kubernetes.io/projected/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-kube-api-access-z7tzr\") pod \"redhat-marketplace-k9q9t\" (UID: \"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d\") " pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:44 crc kubenswrapper[4964]: I1004 03:36:44.529331 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-catalog-content\") pod \"redhat-marketplace-k9q9t\" (UID: \"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d\") " pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:44 crc kubenswrapper[4964]: I1004 03:36:44.529426 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-utilities\") pod \"redhat-marketplace-k9q9t\" (UID: \"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d\") " pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:44 crc kubenswrapper[4964]: I1004 03:36:44.552706 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7tzr\" (UniqueName: \"kubernetes.io/projected/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-kube-api-access-z7tzr\") pod \"redhat-marketplace-k9q9t\" (UID: \"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d\") " 
pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:44 crc kubenswrapper[4964]: I1004 03:36:44.707411 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:45 crc kubenswrapper[4964]: I1004 03:36:45.181990 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9q9t"] Oct 04 03:36:45 crc kubenswrapper[4964]: W1004 03:36:45.197008 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad0cf426_4153_42c8_b1b7_8aa4d6f8264d.slice/crio-57b82d747deaf02d02fa0baf1939094a1fddf934ee4c21f58909af3e6b8f6420 WatchSource:0}: Error finding container 57b82d747deaf02d02fa0baf1939094a1fddf934ee4c21f58909af3e6b8f6420: Status 404 returned error can't find the container with id 57b82d747deaf02d02fa0baf1939094a1fddf934ee4c21f58909af3e6b8f6420 Oct 04 03:36:45 crc kubenswrapper[4964]: I1004 03:36:45.812146 4964 generic.go:334] "Generic (PLEG): container finished" podID="ad0cf426-4153-42c8-b1b7-8aa4d6f8264d" containerID="d1c25986885d23796aea5a02f6e49c56d57612f52f4576c3b9c88d86e1c57448" exitCode=0 Oct 04 03:36:45 crc kubenswrapper[4964]: I1004 03:36:45.812292 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9q9t" event={"ID":"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d","Type":"ContainerDied","Data":"d1c25986885d23796aea5a02f6e49c56d57612f52f4576c3b9c88d86e1c57448"} Oct 04 03:36:45 crc kubenswrapper[4964]: I1004 03:36:45.812607 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9q9t" event={"ID":"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d","Type":"ContainerStarted","Data":"57b82d747deaf02d02fa0baf1939094a1fddf934ee4c21f58909af3e6b8f6420"} Oct 04 03:36:45 crc kubenswrapper[4964]: I1004 03:36:45.815055 4964 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 04 03:36:47 crc kubenswrapper[4964]: I1004 03:36:47.836976 4964 generic.go:334] "Generic (PLEG): container finished" podID="ad0cf426-4153-42c8-b1b7-8aa4d6f8264d" containerID="34946bdb358c2886cb5f797f889182b95f0140ec444d04119f4882cc81268bfa" exitCode=0 Oct 04 03:36:47 crc kubenswrapper[4964]: I1004 03:36:47.837022 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9q9t" event={"ID":"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d","Type":"ContainerDied","Data":"34946bdb358c2886cb5f797f889182b95f0140ec444d04119f4882cc81268bfa"} Oct 04 03:36:48 crc kubenswrapper[4964]: I1004 03:36:48.861838 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9q9t" event={"ID":"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d","Type":"ContainerStarted","Data":"cdc29f106cdcf67c9b63d8e210ac0c5526fddcab2650c8836863d7b58243f34e"} Oct 04 03:36:48 crc kubenswrapper[4964]: I1004 03:36:48.885983 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k9q9t" podStartSLOduration=2.457361004 podStartE2EDuration="4.8859606s" podCreationTimestamp="2025-10-04 03:36:44 +0000 UTC" firstStartedPulling="2025-10-04 03:36:45.814782096 +0000 UTC m=+3385.711740734" lastFinishedPulling="2025-10-04 03:36:48.243381692 +0000 UTC m=+3388.140340330" observedRunningTime="2025-10-04 03:36:48.882782556 +0000 UTC m=+3388.779741234" watchObservedRunningTime="2025-10-04 03:36:48.8859606 +0000 UTC m=+3388.782919248" Oct 04 03:36:54 crc kubenswrapper[4964]: I1004 03:36:54.708171 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:54 crc kubenswrapper[4964]: I1004 03:36:54.708643 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:54 crc kubenswrapper[4964]: 
I1004 03:36:54.782907 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:54 crc kubenswrapper[4964]: I1004 03:36:54.986903 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:55 crc kubenswrapper[4964]: I1004 03:36:55.047294 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9q9t"] Oct 04 03:36:56 crc kubenswrapper[4964]: I1004 03:36:56.950346 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k9q9t" podUID="ad0cf426-4153-42c8-b1b7-8aa4d6f8264d" containerName="registry-server" containerID="cri-o://cdc29f106cdcf67c9b63d8e210ac0c5526fddcab2650c8836863d7b58243f34e" gracePeriod=2 Oct 04 03:36:57 crc kubenswrapper[4964]: I1004 03:36:57.509703 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:57 crc kubenswrapper[4964]: I1004 03:36:57.615799 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-catalog-content\") pod \"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d\" (UID: \"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d\") " Oct 04 03:36:57 crc kubenswrapper[4964]: I1004 03:36:57.615903 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7tzr\" (UniqueName: \"kubernetes.io/projected/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-kube-api-access-z7tzr\") pod \"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d\" (UID: \"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d\") " Oct 04 03:36:57 crc kubenswrapper[4964]: I1004 03:36:57.615995 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-utilities\") pod \"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d\" (UID: \"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d\") " Oct 04 03:36:57 crc kubenswrapper[4964]: I1004 03:36:57.616699 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-utilities" (OuterVolumeSpecName: "utilities") pod "ad0cf426-4153-42c8-b1b7-8aa4d6f8264d" (UID: "ad0cf426-4153-42c8-b1b7-8aa4d6f8264d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:36:57 crc kubenswrapper[4964]: I1004 03:36:57.630843 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-kube-api-access-z7tzr" (OuterVolumeSpecName: "kube-api-access-z7tzr") pod "ad0cf426-4153-42c8-b1b7-8aa4d6f8264d" (UID: "ad0cf426-4153-42c8-b1b7-8aa4d6f8264d"). InnerVolumeSpecName "kube-api-access-z7tzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:36:57 crc kubenswrapper[4964]: I1004 03:36:57.637103 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad0cf426-4153-42c8-b1b7-8aa4d6f8264d" (UID: "ad0cf426-4153-42c8-b1b7-8aa4d6f8264d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:36:57 crc kubenswrapper[4964]: I1004 03:36:57.719352 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:36:57 crc kubenswrapper[4964]: I1004 03:36:57.719802 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7tzr\" (UniqueName: \"kubernetes.io/projected/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-kube-api-access-z7tzr\") on node \"crc\" DevicePath \"\"" Oct 04 03:36:57 crc kubenswrapper[4964]: I1004 03:36:57.719836 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:36:57 crc kubenswrapper[4964]: I1004 03:36:57.964844 4964 generic.go:334] "Generic (PLEG): container finished" podID="ad0cf426-4153-42c8-b1b7-8aa4d6f8264d" containerID="cdc29f106cdcf67c9b63d8e210ac0c5526fddcab2650c8836863d7b58243f34e" exitCode=0 Oct 04 03:36:57 crc kubenswrapper[4964]: I1004 03:36:57.964908 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9q9t" event={"ID":"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d","Type":"ContainerDied","Data":"cdc29f106cdcf67c9b63d8e210ac0c5526fddcab2650c8836863d7b58243f34e"} Oct 04 03:36:57 crc kubenswrapper[4964]: I1004 03:36:57.964956 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9q9t" event={"ID":"ad0cf426-4153-42c8-b1b7-8aa4d6f8264d","Type":"ContainerDied","Data":"57b82d747deaf02d02fa0baf1939094a1fddf934ee4c21f58909af3e6b8f6420"} Oct 04 03:36:57 crc kubenswrapper[4964]: I1004 03:36:57.964997 4964 scope.go:117] "RemoveContainer" containerID="cdc29f106cdcf67c9b63d8e210ac0c5526fddcab2650c8836863d7b58243f34e" Oct 04 03:36:57 crc kubenswrapper[4964]: I1004 
03:36:57.965238 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9q9t" Oct 04 03:36:58 crc kubenswrapper[4964]: I1004 03:36:58.000419 4964 scope.go:117] "RemoveContainer" containerID="34946bdb358c2886cb5f797f889182b95f0140ec444d04119f4882cc81268bfa" Oct 04 03:36:58 crc kubenswrapper[4964]: I1004 03:36:58.046749 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9q9t"] Oct 04 03:36:58 crc kubenswrapper[4964]: I1004 03:36:58.057457 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9q9t"] Oct 04 03:36:58 crc kubenswrapper[4964]: I1004 03:36:58.060702 4964 scope.go:117] "RemoveContainer" containerID="d1c25986885d23796aea5a02f6e49c56d57612f52f4576c3b9c88d86e1c57448" Oct 04 03:36:58 crc kubenswrapper[4964]: I1004 03:36:58.106108 4964 scope.go:117] "RemoveContainer" containerID="cdc29f106cdcf67c9b63d8e210ac0c5526fddcab2650c8836863d7b58243f34e" Oct 04 03:36:58 crc kubenswrapper[4964]: E1004 03:36:58.106597 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdc29f106cdcf67c9b63d8e210ac0c5526fddcab2650c8836863d7b58243f34e\": container with ID starting with cdc29f106cdcf67c9b63d8e210ac0c5526fddcab2650c8836863d7b58243f34e not found: ID does not exist" containerID="cdc29f106cdcf67c9b63d8e210ac0c5526fddcab2650c8836863d7b58243f34e" Oct 04 03:36:58 crc kubenswrapper[4964]: I1004 03:36:58.106655 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdc29f106cdcf67c9b63d8e210ac0c5526fddcab2650c8836863d7b58243f34e"} err="failed to get container status \"cdc29f106cdcf67c9b63d8e210ac0c5526fddcab2650c8836863d7b58243f34e\": rpc error: code = NotFound desc = could not find container \"cdc29f106cdcf67c9b63d8e210ac0c5526fddcab2650c8836863d7b58243f34e\": container with ID starting with 
cdc29f106cdcf67c9b63d8e210ac0c5526fddcab2650c8836863d7b58243f34e not found: ID does not exist" Oct 04 03:36:58 crc kubenswrapper[4964]: I1004 03:36:58.106682 4964 scope.go:117] "RemoveContainer" containerID="34946bdb358c2886cb5f797f889182b95f0140ec444d04119f4882cc81268bfa" Oct 04 03:36:58 crc kubenswrapper[4964]: E1004 03:36:58.107004 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34946bdb358c2886cb5f797f889182b95f0140ec444d04119f4882cc81268bfa\": container with ID starting with 34946bdb358c2886cb5f797f889182b95f0140ec444d04119f4882cc81268bfa not found: ID does not exist" containerID="34946bdb358c2886cb5f797f889182b95f0140ec444d04119f4882cc81268bfa" Oct 04 03:36:58 crc kubenswrapper[4964]: I1004 03:36:58.107035 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34946bdb358c2886cb5f797f889182b95f0140ec444d04119f4882cc81268bfa"} err="failed to get container status \"34946bdb358c2886cb5f797f889182b95f0140ec444d04119f4882cc81268bfa\": rpc error: code = NotFound desc = could not find container \"34946bdb358c2886cb5f797f889182b95f0140ec444d04119f4882cc81268bfa\": container with ID starting with 34946bdb358c2886cb5f797f889182b95f0140ec444d04119f4882cc81268bfa not found: ID does not exist" Oct 04 03:36:58 crc kubenswrapper[4964]: I1004 03:36:58.107055 4964 scope.go:117] "RemoveContainer" containerID="d1c25986885d23796aea5a02f6e49c56d57612f52f4576c3b9c88d86e1c57448" Oct 04 03:36:58 crc kubenswrapper[4964]: E1004 03:36:58.107337 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1c25986885d23796aea5a02f6e49c56d57612f52f4576c3b9c88d86e1c57448\": container with ID starting with d1c25986885d23796aea5a02f6e49c56d57612f52f4576c3b9c88d86e1c57448 not found: ID does not exist" containerID="d1c25986885d23796aea5a02f6e49c56d57612f52f4576c3b9c88d86e1c57448" Oct 04 03:36:58 crc 
kubenswrapper[4964]: I1004 03:36:58.107368 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1c25986885d23796aea5a02f6e49c56d57612f52f4576c3b9c88d86e1c57448"} err="failed to get container status \"d1c25986885d23796aea5a02f6e49c56d57612f52f4576c3b9c88d86e1c57448\": rpc error: code = NotFound desc = could not find container \"d1c25986885d23796aea5a02f6e49c56d57612f52f4576c3b9c88d86e1c57448\": container with ID starting with d1c25986885d23796aea5a02f6e49c56d57612f52f4576c3b9c88d86e1c57448 not found: ID does not exist" Oct 04 03:36:58 crc kubenswrapper[4964]: I1004 03:36:58.866590 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad0cf426-4153-42c8-b1b7-8aa4d6f8264d" path="/var/lib/kubelet/pods/ad0cf426-4153-42c8-b1b7-8aa4d6f8264d/volumes" Oct 04 03:37:04 crc kubenswrapper[4964]: I1004 03:37:04.449510 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:37:04 crc kubenswrapper[4964]: I1004 03:37:04.450492 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:37:04 crc kubenswrapper[4964]: I1004 03:37:04.450580 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 03:37:04 crc kubenswrapper[4964]: I1004 03:37:04.451888 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"412be4ed2625de91b6f7a42ca3ca7bf1d47f40824ac7a8628ae9031e46bbd039"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 03:37:04 crc kubenswrapper[4964]: I1004 03:37:04.452029 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://412be4ed2625de91b6f7a42ca3ca7bf1d47f40824ac7a8628ae9031e46bbd039" gracePeriod=600 Oct 04 03:37:05 crc kubenswrapper[4964]: I1004 03:37:05.051178 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="412be4ed2625de91b6f7a42ca3ca7bf1d47f40824ac7a8628ae9031e46bbd039" exitCode=0 Oct 04 03:37:05 crc kubenswrapper[4964]: I1004 03:37:05.051219 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"412be4ed2625de91b6f7a42ca3ca7bf1d47f40824ac7a8628ae9031e46bbd039"} Oct 04 03:37:05 crc kubenswrapper[4964]: I1004 03:37:05.051572 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204"} Oct 04 03:37:05 crc kubenswrapper[4964]: I1004 03:37:05.051597 4964 scope.go:117] "RemoveContainer" containerID="93ddb586de4bf31929fa95d1f28c3478755f26ada4bb1444f510b5c56bd9af5d" Oct 04 03:39:04 crc kubenswrapper[4964]: I1004 03:39:04.448825 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:39:04 crc kubenswrapper[4964]: I1004 03:39:04.449329 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:39:34 crc kubenswrapper[4964]: I1004 03:39:34.449567 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:39:34 crc kubenswrapper[4964]: I1004 03:39:34.450181 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:40:04 crc kubenswrapper[4964]: I1004 03:40:04.449452 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:40:04 crc kubenswrapper[4964]: I1004 03:40:04.450062 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Oct 04 03:40:04 crc kubenswrapper[4964]: I1004 03:40:04.450119 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 03:40:04 crc kubenswrapper[4964]: I1004 03:40:04.451024 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 03:40:04 crc kubenswrapper[4964]: I1004 03:40:04.451084 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" gracePeriod=600 Oct 04 03:40:04 crc kubenswrapper[4964]: E1004 03:40:04.594220 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:40:04 crc kubenswrapper[4964]: I1004 03:40:04.864188 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" exitCode=0 Oct 04 03:40:04 crc kubenswrapper[4964]: I1004 03:40:04.864281 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" 
event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204"} Oct 04 03:40:04 crc kubenswrapper[4964]: I1004 03:40:04.864368 4964 scope.go:117] "RemoveContainer" containerID="412be4ed2625de91b6f7a42ca3ca7bf1d47f40824ac7a8628ae9031e46bbd039" Oct 04 03:40:04 crc kubenswrapper[4964]: I1004 03:40:04.865085 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:40:04 crc kubenswrapper[4964]: E1004 03:40:04.865380 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:40:16 crc kubenswrapper[4964]: I1004 03:40:16.845305 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:40:16 crc kubenswrapper[4964]: E1004 03:40:16.845949 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.668689 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-txzbs"] Oct 04 03:40:19 crc kubenswrapper[4964]: E1004 03:40:19.671206 4964 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ad0cf426-4153-42c8-b1b7-8aa4d6f8264d" containerName="registry-server" Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.671328 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0cf426-4153-42c8-b1b7-8aa4d6f8264d" containerName="registry-server" Oct 04 03:40:19 crc kubenswrapper[4964]: E1004 03:40:19.671425 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad0cf426-4153-42c8-b1b7-8aa4d6f8264d" containerName="extract-utilities" Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.671501 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0cf426-4153-42c8-b1b7-8aa4d6f8264d" containerName="extract-utilities" Oct 04 03:40:19 crc kubenswrapper[4964]: E1004 03:40:19.671589 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad0cf426-4153-42c8-b1b7-8aa4d6f8264d" containerName="extract-content" Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.672136 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0cf426-4153-42c8-b1b7-8aa4d6f8264d" containerName="extract-content" Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.672494 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad0cf426-4153-42c8-b1b7-8aa4d6f8264d" containerName="registry-server" Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.674361 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.690484 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-txzbs"] Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.813696 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/147c9eea-d780-4708-b7ac-eb8563c3899c-catalog-content\") pod \"community-operators-txzbs\" (UID: \"147c9eea-d780-4708-b7ac-eb8563c3899c\") " pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.813862 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/147c9eea-d780-4708-b7ac-eb8563c3899c-utilities\") pod \"community-operators-txzbs\" (UID: \"147c9eea-d780-4708-b7ac-eb8563c3899c\") " pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.814146 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s98zh\" (UniqueName: \"kubernetes.io/projected/147c9eea-d780-4708-b7ac-eb8563c3899c-kube-api-access-s98zh\") pod \"community-operators-txzbs\" (UID: \"147c9eea-d780-4708-b7ac-eb8563c3899c\") " pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.915841 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s98zh\" (UniqueName: \"kubernetes.io/projected/147c9eea-d780-4708-b7ac-eb8563c3899c-kube-api-access-s98zh\") pod \"community-operators-txzbs\" (UID: \"147c9eea-d780-4708-b7ac-eb8563c3899c\") " pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.915962 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/147c9eea-d780-4708-b7ac-eb8563c3899c-catalog-content\") pod \"community-operators-txzbs\" (UID: \"147c9eea-d780-4708-b7ac-eb8563c3899c\") " pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.916077 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/147c9eea-d780-4708-b7ac-eb8563c3899c-utilities\") pod \"community-operators-txzbs\" (UID: \"147c9eea-d780-4708-b7ac-eb8563c3899c\") " pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.916629 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/147c9eea-d780-4708-b7ac-eb8563c3899c-catalog-content\") pod \"community-operators-txzbs\" (UID: \"147c9eea-d780-4708-b7ac-eb8563c3899c\") " pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.916861 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/147c9eea-d780-4708-b7ac-eb8563c3899c-utilities\") pod \"community-operators-txzbs\" (UID: \"147c9eea-d780-4708-b7ac-eb8563c3899c\") " pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:19 crc kubenswrapper[4964]: I1004 03:40:19.939870 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s98zh\" (UniqueName: \"kubernetes.io/projected/147c9eea-d780-4708-b7ac-eb8563c3899c-kube-api-access-s98zh\") pod \"community-operators-txzbs\" (UID: \"147c9eea-d780-4708-b7ac-eb8563c3899c\") " pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:20 crc kubenswrapper[4964]: I1004 03:40:20.002821 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:21 crc kubenswrapper[4964]: I1004 03:40:21.347297 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-txzbs"] Oct 04 03:40:22 crc kubenswrapper[4964]: I1004 03:40:22.037030 4964 generic.go:334] "Generic (PLEG): container finished" podID="147c9eea-d780-4708-b7ac-eb8563c3899c" containerID="7ccc57b53e41f2a268b2ab2ce10ba91b8033b12d22b7e8287fe035de32b10955" exitCode=0 Oct 04 03:40:22 crc kubenswrapper[4964]: I1004 03:40:22.037101 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzbs" event={"ID":"147c9eea-d780-4708-b7ac-eb8563c3899c","Type":"ContainerDied","Data":"7ccc57b53e41f2a268b2ab2ce10ba91b8033b12d22b7e8287fe035de32b10955"} Oct 04 03:40:22 crc kubenswrapper[4964]: I1004 03:40:22.037548 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzbs" event={"ID":"147c9eea-d780-4708-b7ac-eb8563c3899c","Type":"ContainerStarted","Data":"15db9186408f48af4919ec8484071135b7cf76b4fd6d6d98adbf566f25dcff20"} Oct 04 03:40:23 crc kubenswrapper[4964]: I1004 03:40:23.052181 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzbs" event={"ID":"147c9eea-d780-4708-b7ac-eb8563c3899c","Type":"ContainerStarted","Data":"4d5d8e2e2bc093895aca57e503ea36cda47947ef5b4a7cc45daaa4eba028fe2b"} Oct 04 03:40:24 crc kubenswrapper[4964]: I1004 03:40:24.068876 4964 generic.go:334] "Generic (PLEG): container finished" podID="147c9eea-d780-4708-b7ac-eb8563c3899c" containerID="4d5d8e2e2bc093895aca57e503ea36cda47947ef5b4a7cc45daaa4eba028fe2b" exitCode=0 Oct 04 03:40:24 crc kubenswrapper[4964]: I1004 03:40:24.068967 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzbs" 
event={"ID":"147c9eea-d780-4708-b7ac-eb8563c3899c","Type":"ContainerDied","Data":"4d5d8e2e2bc093895aca57e503ea36cda47947ef5b4a7cc45daaa4eba028fe2b"} Oct 04 03:40:25 crc kubenswrapper[4964]: I1004 03:40:25.090791 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzbs" event={"ID":"147c9eea-d780-4708-b7ac-eb8563c3899c","Type":"ContainerStarted","Data":"2a7cb650067ae528379fe94479ac0b723926717cffcfa318d0a92f50342171aa"} Oct 04 03:40:25 crc kubenswrapper[4964]: I1004 03:40:25.120055 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-txzbs" podStartSLOduration=3.678943241 podStartE2EDuration="6.120031068s" podCreationTimestamp="2025-10-04 03:40:19 +0000 UTC" firstStartedPulling="2025-10-04 03:40:22.039567238 +0000 UTC m=+3601.936525886" lastFinishedPulling="2025-10-04 03:40:24.480655035 +0000 UTC m=+3604.377613713" observedRunningTime="2025-10-04 03:40:25.1171217 +0000 UTC m=+3605.014080348" watchObservedRunningTime="2025-10-04 03:40:25.120031068 +0000 UTC m=+3605.016989726" Oct 04 03:40:29 crc kubenswrapper[4964]: I1004 03:40:29.845574 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:40:29 crc kubenswrapper[4964]: E1004 03:40:29.847123 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:40:30 crc kubenswrapper[4964]: I1004 03:40:30.004143 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:30 crc 
kubenswrapper[4964]: I1004 03:40:30.004222 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:30 crc kubenswrapper[4964]: I1004 03:40:30.099402 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:30 crc kubenswrapper[4964]: I1004 03:40:30.219175 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:30 crc kubenswrapper[4964]: I1004 03:40:30.358762 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-txzbs"] Oct 04 03:40:32 crc kubenswrapper[4964]: I1004 03:40:32.171791 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-txzbs" podUID="147c9eea-d780-4708-b7ac-eb8563c3899c" containerName="registry-server" containerID="cri-o://2a7cb650067ae528379fe94479ac0b723926717cffcfa318d0a92f50342171aa" gracePeriod=2 Oct 04 03:40:32 crc kubenswrapper[4964]: I1004 03:40:32.734645 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:32 crc kubenswrapper[4964]: I1004 03:40:32.823554 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s98zh\" (UniqueName: \"kubernetes.io/projected/147c9eea-d780-4708-b7ac-eb8563c3899c-kube-api-access-s98zh\") pod \"147c9eea-d780-4708-b7ac-eb8563c3899c\" (UID: \"147c9eea-d780-4708-b7ac-eb8563c3899c\") " Oct 04 03:40:32 crc kubenswrapper[4964]: I1004 03:40:32.823981 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/147c9eea-d780-4708-b7ac-eb8563c3899c-utilities\") pod \"147c9eea-d780-4708-b7ac-eb8563c3899c\" (UID: \"147c9eea-d780-4708-b7ac-eb8563c3899c\") " Oct 04 03:40:32 crc kubenswrapper[4964]: I1004 03:40:32.824178 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/147c9eea-d780-4708-b7ac-eb8563c3899c-catalog-content\") pod \"147c9eea-d780-4708-b7ac-eb8563c3899c\" (UID: \"147c9eea-d780-4708-b7ac-eb8563c3899c\") " Oct 04 03:40:32 crc kubenswrapper[4964]: I1004 03:40:32.824984 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/147c9eea-d780-4708-b7ac-eb8563c3899c-utilities" (OuterVolumeSpecName: "utilities") pod "147c9eea-d780-4708-b7ac-eb8563c3899c" (UID: "147c9eea-d780-4708-b7ac-eb8563c3899c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:40:32 crc kubenswrapper[4964]: I1004 03:40:32.825781 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/147c9eea-d780-4708-b7ac-eb8563c3899c-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:40:32 crc kubenswrapper[4964]: I1004 03:40:32.833401 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/147c9eea-d780-4708-b7ac-eb8563c3899c-kube-api-access-s98zh" (OuterVolumeSpecName: "kube-api-access-s98zh") pod "147c9eea-d780-4708-b7ac-eb8563c3899c" (UID: "147c9eea-d780-4708-b7ac-eb8563c3899c"). InnerVolumeSpecName "kube-api-access-s98zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:40:32 crc kubenswrapper[4964]: I1004 03:40:32.909985 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/147c9eea-d780-4708-b7ac-eb8563c3899c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "147c9eea-d780-4708-b7ac-eb8563c3899c" (UID: "147c9eea-d780-4708-b7ac-eb8563c3899c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:40:32 crc kubenswrapper[4964]: I1004 03:40:32.927979 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/147c9eea-d780-4708-b7ac-eb8563c3899c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:40:32 crc kubenswrapper[4964]: I1004 03:40:32.928018 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s98zh\" (UniqueName: \"kubernetes.io/projected/147c9eea-d780-4708-b7ac-eb8563c3899c-kube-api-access-s98zh\") on node \"crc\" DevicePath \"\"" Oct 04 03:40:33 crc kubenswrapper[4964]: I1004 03:40:33.182349 4964 generic.go:334] "Generic (PLEG): container finished" podID="147c9eea-d780-4708-b7ac-eb8563c3899c" containerID="2a7cb650067ae528379fe94479ac0b723926717cffcfa318d0a92f50342171aa" exitCode=0 Oct 04 03:40:33 crc kubenswrapper[4964]: I1004 03:40:33.182436 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-txzbs" Oct 04 03:40:33 crc kubenswrapper[4964]: I1004 03:40:33.182452 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzbs" event={"ID":"147c9eea-d780-4708-b7ac-eb8563c3899c","Type":"ContainerDied","Data":"2a7cb650067ae528379fe94479ac0b723926717cffcfa318d0a92f50342171aa"} Oct 04 03:40:33 crc kubenswrapper[4964]: I1004 03:40:33.182699 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzbs" event={"ID":"147c9eea-d780-4708-b7ac-eb8563c3899c","Type":"ContainerDied","Data":"15db9186408f48af4919ec8484071135b7cf76b4fd6d6d98adbf566f25dcff20"} Oct 04 03:40:33 crc kubenswrapper[4964]: I1004 03:40:33.182745 4964 scope.go:117] "RemoveContainer" containerID="2a7cb650067ae528379fe94479ac0b723926717cffcfa318d0a92f50342171aa" Oct 04 03:40:33 crc kubenswrapper[4964]: I1004 03:40:33.217816 4964 scope.go:117] "RemoveContainer" 
containerID="4d5d8e2e2bc093895aca57e503ea36cda47947ef5b4a7cc45daaa4eba028fe2b" Oct 04 03:40:33 crc kubenswrapper[4964]: I1004 03:40:33.221814 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-txzbs"] Oct 04 03:40:33 crc kubenswrapper[4964]: I1004 03:40:33.233077 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-txzbs"] Oct 04 03:40:33 crc kubenswrapper[4964]: I1004 03:40:33.250737 4964 scope.go:117] "RemoveContainer" containerID="7ccc57b53e41f2a268b2ab2ce10ba91b8033b12d22b7e8287fe035de32b10955" Oct 04 03:40:33 crc kubenswrapper[4964]: I1004 03:40:33.309041 4964 scope.go:117] "RemoveContainer" containerID="2a7cb650067ae528379fe94479ac0b723926717cffcfa318d0a92f50342171aa" Oct 04 03:40:33 crc kubenswrapper[4964]: E1004 03:40:33.309485 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a7cb650067ae528379fe94479ac0b723926717cffcfa318d0a92f50342171aa\": container with ID starting with 2a7cb650067ae528379fe94479ac0b723926717cffcfa318d0a92f50342171aa not found: ID does not exist" containerID="2a7cb650067ae528379fe94479ac0b723926717cffcfa318d0a92f50342171aa" Oct 04 03:40:33 crc kubenswrapper[4964]: I1004 03:40:33.309535 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a7cb650067ae528379fe94479ac0b723926717cffcfa318d0a92f50342171aa"} err="failed to get container status \"2a7cb650067ae528379fe94479ac0b723926717cffcfa318d0a92f50342171aa\": rpc error: code = NotFound desc = could not find container \"2a7cb650067ae528379fe94479ac0b723926717cffcfa318d0a92f50342171aa\": container with ID starting with 2a7cb650067ae528379fe94479ac0b723926717cffcfa318d0a92f50342171aa not found: ID does not exist" Oct 04 03:40:33 crc kubenswrapper[4964]: I1004 03:40:33.309567 4964 scope.go:117] "RemoveContainer" 
containerID="4d5d8e2e2bc093895aca57e503ea36cda47947ef5b4a7cc45daaa4eba028fe2b" Oct 04 03:40:33 crc kubenswrapper[4964]: E1004 03:40:33.309989 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d5d8e2e2bc093895aca57e503ea36cda47947ef5b4a7cc45daaa4eba028fe2b\": container with ID starting with 4d5d8e2e2bc093895aca57e503ea36cda47947ef5b4a7cc45daaa4eba028fe2b not found: ID does not exist" containerID="4d5d8e2e2bc093895aca57e503ea36cda47947ef5b4a7cc45daaa4eba028fe2b" Oct 04 03:40:33 crc kubenswrapper[4964]: I1004 03:40:33.310044 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d5d8e2e2bc093895aca57e503ea36cda47947ef5b4a7cc45daaa4eba028fe2b"} err="failed to get container status \"4d5d8e2e2bc093895aca57e503ea36cda47947ef5b4a7cc45daaa4eba028fe2b\": rpc error: code = NotFound desc = could not find container \"4d5d8e2e2bc093895aca57e503ea36cda47947ef5b4a7cc45daaa4eba028fe2b\": container with ID starting with 4d5d8e2e2bc093895aca57e503ea36cda47947ef5b4a7cc45daaa4eba028fe2b not found: ID does not exist" Oct 04 03:40:33 crc kubenswrapper[4964]: I1004 03:40:33.310078 4964 scope.go:117] "RemoveContainer" containerID="7ccc57b53e41f2a268b2ab2ce10ba91b8033b12d22b7e8287fe035de32b10955" Oct 04 03:40:33 crc kubenswrapper[4964]: E1004 03:40:33.310376 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ccc57b53e41f2a268b2ab2ce10ba91b8033b12d22b7e8287fe035de32b10955\": container with ID starting with 7ccc57b53e41f2a268b2ab2ce10ba91b8033b12d22b7e8287fe035de32b10955 not found: ID does not exist" containerID="7ccc57b53e41f2a268b2ab2ce10ba91b8033b12d22b7e8287fe035de32b10955" Oct 04 03:40:33 crc kubenswrapper[4964]: I1004 03:40:33.310460 4964 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7ccc57b53e41f2a268b2ab2ce10ba91b8033b12d22b7e8287fe035de32b10955"} err="failed to get container status \"7ccc57b53e41f2a268b2ab2ce10ba91b8033b12d22b7e8287fe035de32b10955\": rpc error: code = NotFound desc = could not find container \"7ccc57b53e41f2a268b2ab2ce10ba91b8033b12d22b7e8287fe035de32b10955\": container with ID starting with 7ccc57b53e41f2a268b2ab2ce10ba91b8033b12d22b7e8287fe035de32b10955 not found: ID does not exist" Oct 04 03:40:34 crc kubenswrapper[4964]: I1004 03:40:34.856904 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="147c9eea-d780-4708-b7ac-eb8563c3899c" path="/var/lib/kubelet/pods/147c9eea-d780-4708-b7ac-eb8563c3899c/volumes" Oct 04 03:40:43 crc kubenswrapper[4964]: I1004 03:40:43.846013 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:40:43 crc kubenswrapper[4964]: E1004 03:40:43.846787 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:40:58 crc kubenswrapper[4964]: I1004 03:40:58.845878 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:40:58 crc kubenswrapper[4964]: E1004 03:40:58.846885 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:41:11 crc kubenswrapper[4964]: I1004 03:41:11.845210 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:41:11 crc kubenswrapper[4964]: E1004 03:41:11.846037 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:41:17 crc kubenswrapper[4964]: I1004 03:41:17.046920 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-snvm2"] Oct 04 03:41:17 crc kubenswrapper[4964]: I1004 03:41:17.055471 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-snvm2"] Oct 04 03:41:18 crc kubenswrapper[4964]: I1004 03:41:18.861720 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b92794e-7e78-4c34-8967-4c7354aa9df2" path="/var/lib/kubelet/pods/4b92794e-7e78-4c34-8967-4c7354aa9df2/volumes" Oct 04 03:41:24 crc kubenswrapper[4964]: I1004 03:41:24.845882 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:41:24 crc kubenswrapper[4964]: E1004 03:41:24.846669 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" 
podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:41:29 crc kubenswrapper[4964]: I1004 03:41:29.042053 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-4c88-account-create-2d7xt"] Oct 04 03:41:29 crc kubenswrapper[4964]: I1004 03:41:29.051127 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-4c88-account-create-2d7xt"] Oct 04 03:41:30 crc kubenswrapper[4964]: I1004 03:41:30.013031 4964 scope.go:117] "RemoveContainer" containerID="4a43c8a8c9c57712d385a29d486c1d47a82d1d2a57dfe5b518b1652837361e58" Oct 04 03:41:30 crc kubenswrapper[4964]: I1004 03:41:30.867477 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c65481e-f1b3-4877-aaea-cdbb32849b0a" path="/var/lib/kubelet/pods/2c65481e-f1b3-4877-aaea-cdbb32849b0a/volumes" Oct 04 03:41:39 crc kubenswrapper[4964]: I1004 03:41:39.846544 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:41:39 crc kubenswrapper[4964]: E1004 03:41:39.847580 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.707732 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l68nq"] Oct 04 03:41:48 crc kubenswrapper[4964]: E1004 03:41:48.708822 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147c9eea-d780-4708-b7ac-eb8563c3899c" containerName="extract-utilities" Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.708841 4964 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="147c9eea-d780-4708-b7ac-eb8563c3899c" containerName="extract-utilities" Oct 04 03:41:48 crc kubenswrapper[4964]: E1004 03:41:48.708885 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147c9eea-d780-4708-b7ac-eb8563c3899c" containerName="extract-content" Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.708894 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="147c9eea-d780-4708-b7ac-eb8563c3899c" containerName="extract-content" Oct 04 03:41:48 crc kubenswrapper[4964]: E1004 03:41:48.708913 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147c9eea-d780-4708-b7ac-eb8563c3899c" containerName="registry-server" Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.708921 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="147c9eea-d780-4708-b7ac-eb8563c3899c" containerName="registry-server" Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.709154 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="147c9eea-d780-4708-b7ac-eb8563c3899c" containerName="registry-server" Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.710768 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.736002 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l68nq"] Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.804318 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-catalog-content\") pod \"certified-operators-l68nq\" (UID: \"d395c7ac-5dde-4b8b-a8a1-77712a7a1462\") " pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.804377 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-utilities\") pod \"certified-operators-l68nq\" (UID: \"d395c7ac-5dde-4b8b-a8a1-77712a7a1462\") " pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.804637 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfzv9\" (UniqueName: \"kubernetes.io/projected/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-kube-api-access-cfzv9\") pod \"certified-operators-l68nq\" (UID: \"d395c7ac-5dde-4b8b-a8a1-77712a7a1462\") " pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.906359 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfzv9\" (UniqueName: \"kubernetes.io/projected/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-kube-api-access-cfzv9\") pod \"certified-operators-l68nq\" (UID: \"d395c7ac-5dde-4b8b-a8a1-77712a7a1462\") " pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.906902 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-catalog-content\") pod \"certified-operators-l68nq\" (UID: \"d395c7ac-5dde-4b8b-a8a1-77712a7a1462\") " pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.906956 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-utilities\") pod \"certified-operators-l68nq\" (UID: \"d395c7ac-5dde-4b8b-a8a1-77712a7a1462\") " pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.907590 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-catalog-content\") pod \"certified-operators-l68nq\" (UID: \"d395c7ac-5dde-4b8b-a8a1-77712a7a1462\") " pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.907599 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-utilities\") pod \"certified-operators-l68nq\" (UID: \"d395c7ac-5dde-4b8b-a8a1-77712a7a1462\") " pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:41:48 crc kubenswrapper[4964]: I1004 03:41:48.926602 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfzv9\" (UniqueName: \"kubernetes.io/projected/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-kube-api-access-cfzv9\") pod \"certified-operators-l68nq\" (UID: \"d395c7ac-5dde-4b8b-a8a1-77712a7a1462\") " pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:41:49 crc kubenswrapper[4964]: I1004 03:41:49.031667 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:41:49 crc kubenswrapper[4964]: I1004 03:41:49.552714 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l68nq"] Oct 04 03:41:49 crc kubenswrapper[4964]: I1004 03:41:49.961188 4964 generic.go:334] "Generic (PLEG): container finished" podID="d395c7ac-5dde-4b8b-a8a1-77712a7a1462" containerID="abd400d2bf160b698b825e53dabe5adfff854c813562770b4b5722290112fe99" exitCode=0 Oct 04 03:41:49 crc kubenswrapper[4964]: I1004 03:41:49.961276 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l68nq" event={"ID":"d395c7ac-5dde-4b8b-a8a1-77712a7a1462","Type":"ContainerDied","Data":"abd400d2bf160b698b825e53dabe5adfff854c813562770b4b5722290112fe99"} Oct 04 03:41:49 crc kubenswrapper[4964]: I1004 03:41:49.961533 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l68nq" event={"ID":"d395c7ac-5dde-4b8b-a8a1-77712a7a1462","Type":"ContainerStarted","Data":"5eba88e4a1449317d9575bf9f097775d380101af998b8efaa36c8e2e62f0efb0"} Oct 04 03:41:49 crc kubenswrapper[4964]: I1004 03:41:49.963570 4964 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 03:41:51 crc kubenswrapper[4964]: I1004 03:41:51.057472 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-fqh7q"] Oct 04 03:41:51 crc kubenswrapper[4964]: I1004 03:41:51.075370 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-fqh7q"] Oct 04 03:41:51 crc kubenswrapper[4964]: I1004 03:41:51.981209 4964 generic.go:334] "Generic (PLEG): container finished" podID="d395c7ac-5dde-4b8b-a8a1-77712a7a1462" containerID="0d0fc8214bf0494e9b973706486d25b41f8d6f52889f4e9a446da8ac5ea3ab26" exitCode=0 Oct 04 03:41:51 crc kubenswrapper[4964]: I1004 03:41:51.981267 4964 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-l68nq" event={"ID":"d395c7ac-5dde-4b8b-a8a1-77712a7a1462","Type":"ContainerDied","Data":"0d0fc8214bf0494e9b973706486d25b41f8d6f52889f4e9a446da8ac5ea3ab26"} Oct 04 03:41:52 crc kubenswrapper[4964]: I1004 03:41:52.860018 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47754785-d3d3-461f-a710-cb74b48b1a3e" path="/var/lib/kubelet/pods/47754785-d3d3-461f-a710-cb74b48b1a3e/volumes" Oct 04 03:41:52 crc kubenswrapper[4964]: I1004 03:41:52.991354 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l68nq" event={"ID":"d395c7ac-5dde-4b8b-a8a1-77712a7a1462","Type":"ContainerStarted","Data":"61bd3c1cc149fad45af2418fd232c9d64d49c2cf44032c0f780213c736fc059b"} Oct 04 03:41:53 crc kubenswrapper[4964]: I1004 03:41:53.013440 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l68nq" podStartSLOduration=2.598796449 podStartE2EDuration="5.013420163s" podCreationTimestamp="2025-10-04 03:41:48 +0000 UTC" firstStartedPulling="2025-10-04 03:41:49.963345961 +0000 UTC m=+3689.860304599" lastFinishedPulling="2025-10-04 03:41:52.377969635 +0000 UTC m=+3692.274928313" observedRunningTime="2025-10-04 03:41:53.008744839 +0000 UTC m=+3692.905703487" watchObservedRunningTime="2025-10-04 03:41:53.013420163 +0000 UTC m=+3692.910378821" Oct 04 03:41:54 crc kubenswrapper[4964]: I1004 03:41:54.845770 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:41:54 crc kubenswrapper[4964]: E1004 03:41:54.846315 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:41:59 crc kubenswrapper[4964]: I1004 03:41:59.033210 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:41:59 crc kubenswrapper[4964]: I1004 03:41:59.033893 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:41:59 crc kubenswrapper[4964]: I1004 03:41:59.098499 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:41:59 crc kubenswrapper[4964]: I1004 03:41:59.156001 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:41:59 crc kubenswrapper[4964]: I1004 03:41:59.355668 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l68nq"] Oct 04 03:42:01 crc kubenswrapper[4964]: I1004 03:42:01.073110 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l68nq" podUID="d395c7ac-5dde-4b8b-a8a1-77712a7a1462" containerName="registry-server" containerID="cri-o://61bd3c1cc149fad45af2418fd232c9d64d49c2cf44032c0f780213c736fc059b" gracePeriod=2 Oct 04 03:42:01 crc kubenswrapper[4964]: I1004 03:42:01.736897 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:42:01 crc kubenswrapper[4964]: I1004 03:42:01.777319 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-catalog-content\") pod \"d395c7ac-5dde-4b8b-a8a1-77712a7a1462\" (UID: \"d395c7ac-5dde-4b8b-a8a1-77712a7a1462\") " Oct 04 03:42:01 crc kubenswrapper[4964]: I1004 03:42:01.777697 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfzv9\" (UniqueName: \"kubernetes.io/projected/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-kube-api-access-cfzv9\") pod \"d395c7ac-5dde-4b8b-a8a1-77712a7a1462\" (UID: \"d395c7ac-5dde-4b8b-a8a1-77712a7a1462\") " Oct 04 03:42:01 crc kubenswrapper[4964]: I1004 03:42:01.777992 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-utilities\") pod \"d395c7ac-5dde-4b8b-a8a1-77712a7a1462\" (UID: \"d395c7ac-5dde-4b8b-a8a1-77712a7a1462\") " Oct 04 03:42:01 crc kubenswrapper[4964]: I1004 03:42:01.779199 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-utilities" (OuterVolumeSpecName: "utilities") pod "d395c7ac-5dde-4b8b-a8a1-77712a7a1462" (UID: "d395c7ac-5dde-4b8b-a8a1-77712a7a1462"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:42:01 crc kubenswrapper[4964]: I1004 03:42:01.784121 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-kube-api-access-cfzv9" (OuterVolumeSpecName: "kube-api-access-cfzv9") pod "d395c7ac-5dde-4b8b-a8a1-77712a7a1462" (UID: "d395c7ac-5dde-4b8b-a8a1-77712a7a1462"). InnerVolumeSpecName "kube-api-access-cfzv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:42:01 crc kubenswrapper[4964]: I1004 03:42:01.881588 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:42:01 crc kubenswrapper[4964]: I1004 03:42:01.881633 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfzv9\" (UniqueName: \"kubernetes.io/projected/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-kube-api-access-cfzv9\") on node \"crc\" DevicePath \"\"" Oct 04 03:42:01 crc kubenswrapper[4964]: I1004 03:42:01.977431 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d395c7ac-5dde-4b8b-a8a1-77712a7a1462" (UID: "d395c7ac-5dde-4b8b-a8a1-77712a7a1462"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:42:01 crc kubenswrapper[4964]: I1004 03:42:01.983782 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d395c7ac-5dde-4b8b-a8a1-77712a7a1462-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 03:42:02.089904 4964 generic.go:334] "Generic (PLEG): container finished" podID="d395c7ac-5dde-4b8b-a8a1-77712a7a1462" containerID="61bd3c1cc149fad45af2418fd232c9d64d49c2cf44032c0f780213c736fc059b" exitCode=0 Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 03:42:02.089986 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l68nq" event={"ID":"d395c7ac-5dde-4b8b-a8a1-77712a7a1462","Type":"ContainerDied","Data":"61bd3c1cc149fad45af2418fd232c9d64d49c2cf44032c0f780213c736fc059b"} Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 03:42:02.090288 4964 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-l68nq" event={"ID":"d395c7ac-5dde-4b8b-a8a1-77712a7a1462","Type":"ContainerDied","Data":"5eba88e4a1449317d9575bf9f097775d380101af998b8efaa36c8e2e62f0efb0"} Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 03:42:02.090318 4964 scope.go:117] "RemoveContainer" containerID="61bd3c1cc149fad45af2418fd232c9d64d49c2cf44032c0f780213c736fc059b" Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 03:42:02.090016 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l68nq" Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 03:42:02.120351 4964 scope.go:117] "RemoveContainer" containerID="0d0fc8214bf0494e9b973706486d25b41f8d6f52889f4e9a446da8ac5ea3ab26" Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 03:42:02.142659 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l68nq"] Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 03:42:02.153373 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l68nq"] Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 03:42:02.166830 4964 scope.go:117] "RemoveContainer" containerID="abd400d2bf160b698b825e53dabe5adfff854c813562770b4b5722290112fe99" Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 03:42:02.198230 4964 scope.go:117] "RemoveContainer" containerID="61bd3c1cc149fad45af2418fd232c9d64d49c2cf44032c0f780213c736fc059b" Oct 04 03:42:02 crc kubenswrapper[4964]: E1004 03:42:02.198918 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61bd3c1cc149fad45af2418fd232c9d64d49c2cf44032c0f780213c736fc059b\": container with ID starting with 61bd3c1cc149fad45af2418fd232c9d64d49c2cf44032c0f780213c736fc059b not found: ID does not exist" containerID="61bd3c1cc149fad45af2418fd232c9d64d49c2cf44032c0f780213c736fc059b" Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 
03:42:02.198979 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61bd3c1cc149fad45af2418fd232c9d64d49c2cf44032c0f780213c736fc059b"} err="failed to get container status \"61bd3c1cc149fad45af2418fd232c9d64d49c2cf44032c0f780213c736fc059b\": rpc error: code = NotFound desc = could not find container \"61bd3c1cc149fad45af2418fd232c9d64d49c2cf44032c0f780213c736fc059b\": container with ID starting with 61bd3c1cc149fad45af2418fd232c9d64d49c2cf44032c0f780213c736fc059b not found: ID does not exist" Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 03:42:02.199005 4964 scope.go:117] "RemoveContainer" containerID="0d0fc8214bf0494e9b973706486d25b41f8d6f52889f4e9a446da8ac5ea3ab26" Oct 04 03:42:02 crc kubenswrapper[4964]: E1004 03:42:02.199595 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d0fc8214bf0494e9b973706486d25b41f8d6f52889f4e9a446da8ac5ea3ab26\": container with ID starting with 0d0fc8214bf0494e9b973706486d25b41f8d6f52889f4e9a446da8ac5ea3ab26 not found: ID does not exist" containerID="0d0fc8214bf0494e9b973706486d25b41f8d6f52889f4e9a446da8ac5ea3ab26" Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 03:42:02.199715 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d0fc8214bf0494e9b973706486d25b41f8d6f52889f4e9a446da8ac5ea3ab26"} err="failed to get container status \"0d0fc8214bf0494e9b973706486d25b41f8d6f52889f4e9a446da8ac5ea3ab26\": rpc error: code = NotFound desc = could not find container \"0d0fc8214bf0494e9b973706486d25b41f8d6f52889f4e9a446da8ac5ea3ab26\": container with ID starting with 0d0fc8214bf0494e9b973706486d25b41f8d6f52889f4e9a446da8ac5ea3ab26 not found: ID does not exist" Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 03:42:02.199769 4964 scope.go:117] "RemoveContainer" containerID="abd400d2bf160b698b825e53dabe5adfff854c813562770b4b5722290112fe99" Oct 04 03:42:02 crc 
kubenswrapper[4964]: E1004 03:42:02.200304 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abd400d2bf160b698b825e53dabe5adfff854c813562770b4b5722290112fe99\": container with ID starting with abd400d2bf160b698b825e53dabe5adfff854c813562770b4b5722290112fe99 not found: ID does not exist" containerID="abd400d2bf160b698b825e53dabe5adfff854c813562770b4b5722290112fe99" Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 03:42:02.200358 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abd400d2bf160b698b825e53dabe5adfff854c813562770b4b5722290112fe99"} err="failed to get container status \"abd400d2bf160b698b825e53dabe5adfff854c813562770b4b5722290112fe99\": rpc error: code = NotFound desc = could not find container \"abd400d2bf160b698b825e53dabe5adfff854c813562770b4b5722290112fe99\": container with ID starting with abd400d2bf160b698b825e53dabe5adfff854c813562770b4b5722290112fe99 not found: ID does not exist" Oct 04 03:42:02 crc kubenswrapper[4964]: I1004 03:42:02.856709 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d395c7ac-5dde-4b8b-a8a1-77712a7a1462" path="/var/lib/kubelet/pods/d395c7ac-5dde-4b8b-a8a1-77712a7a1462/volumes" Oct 04 03:42:08 crc kubenswrapper[4964]: I1004 03:42:08.846115 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:42:08 crc kubenswrapper[4964]: E1004 03:42:08.847073 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:42:21 crc 
kubenswrapper[4964]: I1004 03:42:21.846077 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:42:21 crc kubenswrapper[4964]: E1004 03:42:21.847330 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:42:30 crc kubenswrapper[4964]: I1004 03:42:30.112759 4964 scope.go:117] "RemoveContainer" containerID="04e69927611ac904588a58a6ffd367f6ef1f5b2c045116cd182b6a84c21945d9" Oct 04 03:42:30 crc kubenswrapper[4964]: I1004 03:42:30.179449 4964 scope.go:117] "RemoveContainer" containerID="fe0cad209f2d6eeebc1aaa5a4feffecdafea16bb39eee03ce7b50ab84f12e85d" Oct 04 03:42:34 crc kubenswrapper[4964]: I1004 03:42:34.845444 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:42:34 crc kubenswrapper[4964]: E1004 03:42:34.846211 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:42:47 crc kubenswrapper[4964]: I1004 03:42:47.845909 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:42:47 crc kubenswrapper[4964]: E1004 03:42:47.846558 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:43:02 crc kubenswrapper[4964]: I1004 03:43:02.845682 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:43:02 crc kubenswrapper[4964]: E1004 03:43:02.846536 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:43:13 crc kubenswrapper[4964]: I1004 03:43:13.846663 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:43:13 crc kubenswrapper[4964]: E1004 03:43:13.847992 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:43:27 crc kubenswrapper[4964]: I1004 03:43:27.845511 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:43:27 crc kubenswrapper[4964]: E1004 03:43:27.846186 4964 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:43:41 crc kubenswrapper[4964]: I1004 03:43:41.846135 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:43:41 crc kubenswrapper[4964]: E1004 03:43:41.847129 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:43:52 crc kubenswrapper[4964]: I1004 03:43:52.846108 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:43:52 crc kubenswrapper[4964]: E1004 03:43:52.847006 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:44:04 crc kubenswrapper[4964]: I1004 03:44:04.846205 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:44:04 crc kubenswrapper[4964]: E1004 03:44:04.847343 4964 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:44:18 crc kubenswrapper[4964]: I1004 03:44:18.845432 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:44:18 crc kubenswrapper[4964]: E1004 03:44:18.846659 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:44:30 crc kubenswrapper[4964]: I1004 03:44:30.853763 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:44:30 crc kubenswrapper[4964]: E1004 03:44:30.854840 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:44:44 crc kubenswrapper[4964]: I1004 03:44:44.846957 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:44:44 crc kubenswrapper[4964]: E1004 03:44:44.848302 4964 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:44:57 crc kubenswrapper[4964]: I1004 03:44:57.844813 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:44:57 crc kubenswrapper[4964]: E1004 03:44:57.845534 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.160270 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr"] Oct 04 03:45:00 crc kubenswrapper[4964]: E1004 03:45:00.161965 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d395c7ac-5dde-4b8b-a8a1-77712a7a1462" containerName="extract-utilities" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.162051 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="d395c7ac-5dde-4b8b-a8a1-77712a7a1462" containerName="extract-utilities" Oct 04 03:45:00 crc kubenswrapper[4964]: E1004 03:45:00.162119 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d395c7ac-5dde-4b8b-a8a1-77712a7a1462" containerName="extract-content" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.162173 4964 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d395c7ac-5dde-4b8b-a8a1-77712a7a1462" containerName="extract-content" Oct 04 03:45:00 crc kubenswrapper[4964]: E1004 03:45:00.162262 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d395c7ac-5dde-4b8b-a8a1-77712a7a1462" containerName="registry-server" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.162316 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="d395c7ac-5dde-4b8b-a8a1-77712a7a1462" containerName="registry-server" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.162546 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="d395c7ac-5dde-4b8b-a8a1-77712a7a1462" containerName="registry-server" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.163271 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.165370 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.165473 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.170249 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr"] Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.296917 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpx7k\" (UniqueName: \"kubernetes.io/projected/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-kube-api-access-jpx7k\") pod \"collect-profiles-29325825-jmhwr\" (UID: \"c5863d05-1eee-4f45-b2c6-e7eab197fa1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" Oct 04 03:45:00 crc kubenswrapper[4964]: 
I1004 03:45:00.296979 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-config-volume\") pod \"collect-profiles-29325825-jmhwr\" (UID: \"c5863d05-1eee-4f45-b2c6-e7eab197fa1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.297010 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-secret-volume\") pod \"collect-profiles-29325825-jmhwr\" (UID: \"c5863d05-1eee-4f45-b2c6-e7eab197fa1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.399110 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpx7k\" (UniqueName: \"kubernetes.io/projected/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-kube-api-access-jpx7k\") pod \"collect-profiles-29325825-jmhwr\" (UID: \"c5863d05-1eee-4f45-b2c6-e7eab197fa1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.399175 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-config-volume\") pod \"collect-profiles-29325825-jmhwr\" (UID: \"c5863d05-1eee-4f45-b2c6-e7eab197fa1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.399210 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-secret-volume\") pod \"collect-profiles-29325825-jmhwr\" (UID: 
\"c5863d05-1eee-4f45-b2c6-e7eab197fa1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.401394 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-config-volume\") pod \"collect-profiles-29325825-jmhwr\" (UID: \"c5863d05-1eee-4f45-b2c6-e7eab197fa1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.408833 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-secret-volume\") pod \"collect-profiles-29325825-jmhwr\" (UID: \"c5863d05-1eee-4f45-b2c6-e7eab197fa1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.422231 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpx7k\" (UniqueName: \"kubernetes.io/projected/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-kube-api-access-jpx7k\") pod \"collect-profiles-29325825-jmhwr\" (UID: \"c5863d05-1eee-4f45-b2c6-e7eab197fa1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.490054 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" Oct 04 03:45:00 crc kubenswrapper[4964]: I1004 03:45:00.975981 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr"] Oct 04 03:45:01 crc kubenswrapper[4964]: I1004 03:45:01.809328 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" event={"ID":"c5863d05-1eee-4f45-b2c6-e7eab197fa1e","Type":"ContainerStarted","Data":"068d0de4f4247ec45fb89116398143342577911a2c35bbacd4860b6bde3ba595"} Oct 04 03:45:01 crc kubenswrapper[4964]: I1004 03:45:01.809751 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" event={"ID":"c5863d05-1eee-4f45-b2c6-e7eab197fa1e","Type":"ContainerStarted","Data":"6e529a19fc37c9ca143db831cd4de10a6cc19bd39758aebf62e1dfe4f78aacc8"} Oct 04 03:45:01 crc kubenswrapper[4964]: I1004 03:45:01.827092 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" podStartSLOduration=1.827076116 podStartE2EDuration="1.827076116s" podCreationTimestamp="2025-10-04 03:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 03:45:01.824090835 +0000 UTC m=+3881.721049503" watchObservedRunningTime="2025-10-04 03:45:01.827076116 +0000 UTC m=+3881.724034754" Oct 04 03:45:03 crc kubenswrapper[4964]: I1004 03:45:03.831849 4964 generic.go:334] "Generic (PLEG): container finished" podID="c5863d05-1eee-4f45-b2c6-e7eab197fa1e" containerID="068d0de4f4247ec45fb89116398143342577911a2c35bbacd4860b6bde3ba595" exitCode=0 Oct 04 03:45:03 crc kubenswrapper[4964]: I1004 03:45:03.831927 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" event={"ID":"c5863d05-1eee-4f45-b2c6-e7eab197fa1e","Type":"ContainerDied","Data":"068d0de4f4247ec45fb89116398143342577911a2c35bbacd4860b6bde3ba595"} Oct 04 03:45:05 crc kubenswrapper[4964]: I1004 03:45:05.675975 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" Oct 04 03:45:05 crc kubenswrapper[4964]: I1004 03:45:05.807953 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-secret-volume\") pod \"c5863d05-1eee-4f45-b2c6-e7eab197fa1e\" (UID: \"c5863d05-1eee-4f45-b2c6-e7eab197fa1e\") " Oct 04 03:45:05 crc kubenswrapper[4964]: I1004 03:45:05.808375 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-config-volume\") pod \"c5863d05-1eee-4f45-b2c6-e7eab197fa1e\" (UID: \"c5863d05-1eee-4f45-b2c6-e7eab197fa1e\") " Oct 04 03:45:05 crc kubenswrapper[4964]: I1004 03:45:05.808410 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpx7k\" (UniqueName: \"kubernetes.io/projected/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-kube-api-access-jpx7k\") pod \"c5863d05-1eee-4f45-b2c6-e7eab197fa1e\" (UID: \"c5863d05-1eee-4f45-b2c6-e7eab197fa1e\") " Oct 04 03:45:05 crc kubenswrapper[4964]: I1004 03:45:05.809531 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-config-volume" (OuterVolumeSpecName: "config-volume") pod "c5863d05-1eee-4f45-b2c6-e7eab197fa1e" (UID: "c5863d05-1eee-4f45-b2c6-e7eab197fa1e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 03:45:05 crc kubenswrapper[4964]: I1004 03:45:05.813398 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-kube-api-access-jpx7k" (OuterVolumeSpecName: "kube-api-access-jpx7k") pod "c5863d05-1eee-4f45-b2c6-e7eab197fa1e" (UID: "c5863d05-1eee-4f45-b2c6-e7eab197fa1e"). InnerVolumeSpecName "kube-api-access-jpx7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:45:05 crc kubenswrapper[4964]: I1004 03:45:05.816868 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c5863d05-1eee-4f45-b2c6-e7eab197fa1e" (UID: "c5863d05-1eee-4f45-b2c6-e7eab197fa1e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 03:45:05 crc kubenswrapper[4964]: I1004 03:45:05.855269 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" event={"ID":"c5863d05-1eee-4f45-b2c6-e7eab197fa1e","Type":"ContainerDied","Data":"6e529a19fc37c9ca143db831cd4de10a6cc19bd39758aebf62e1dfe4f78aacc8"} Oct 04 03:45:05 crc kubenswrapper[4964]: I1004 03:45:05.855321 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e529a19fc37c9ca143db831cd4de10a6cc19bd39758aebf62e1dfe4f78aacc8" Oct 04 03:45:05 crc kubenswrapper[4964]: I1004 03:45:05.855318 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325825-jmhwr" Oct 04 03:45:05 crc kubenswrapper[4964]: I1004 03:45:05.911110 4964 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 04 03:45:05 crc kubenswrapper[4964]: I1004 03:45:05.911144 4964 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-config-volume\") on node \"crc\" DevicePath \"\"" Oct 04 03:45:05 crc kubenswrapper[4964]: I1004 03:45:05.911157 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpx7k\" (UniqueName: \"kubernetes.io/projected/c5863d05-1eee-4f45-b2c6-e7eab197fa1e-kube-api-access-jpx7k\") on node \"crc\" DevicePath \"\"" Oct 04 03:45:05 crc kubenswrapper[4964]: I1004 03:45:05.920514 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj"] Oct 04 03:45:05 crc kubenswrapper[4964]: I1004 03:45:05.930945 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325780-8nhcj"] Oct 04 03:45:06 crc kubenswrapper[4964]: I1004 03:45:06.860650 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aed3aad0-399d-4d5e-91cd-1fa0c65af611" path="/var/lib/kubelet/pods/aed3aad0-399d-4d5e-91cd-1fa0c65af611/volumes" Oct 04 03:45:11 crc kubenswrapper[4964]: I1004 03:45:11.845313 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:45:12 crc kubenswrapper[4964]: I1004 03:45:12.930541 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" 
event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"6531ea61e3c01e8aa200a1fc0c735dfaaea9219b30aafdb50cd4c9a00356589d"} Oct 04 03:45:30 crc kubenswrapper[4964]: I1004 03:45:30.314044 4964 scope.go:117] "RemoveContainer" containerID="5975933ad7114cb3bc088caf90d6f71c8c9e419f59fd25e77c9713c3cb606ae6" Oct 04 03:47:34 crc kubenswrapper[4964]: I1004 03:47:34.449351 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:47:34 crc kubenswrapper[4964]: I1004 03:47:34.449999 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:48:04 crc kubenswrapper[4964]: I1004 03:48:04.449726 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:48:04 crc kubenswrapper[4964]: I1004 03:48:04.450306 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:48:34 crc kubenswrapper[4964]: I1004 03:48:34.449547 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:48:34 crc kubenswrapper[4964]: I1004 03:48:34.450216 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:48:34 crc kubenswrapper[4964]: I1004 03:48:34.450281 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 03:48:34 crc kubenswrapper[4964]: I1004 03:48:34.451244 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6531ea61e3c01e8aa200a1fc0c735dfaaea9219b30aafdb50cd4c9a00356589d"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 03:48:34 crc kubenswrapper[4964]: I1004 03:48:34.451361 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://6531ea61e3c01e8aa200a1fc0c735dfaaea9219b30aafdb50cd4c9a00356589d" gracePeriod=600 Oct 04 03:48:35 crc kubenswrapper[4964]: I1004 03:48:35.031242 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="6531ea61e3c01e8aa200a1fc0c735dfaaea9219b30aafdb50cd4c9a00356589d" exitCode=0 Oct 04 03:48:35 crc kubenswrapper[4964]: I1004 03:48:35.031321 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"6531ea61e3c01e8aa200a1fc0c735dfaaea9219b30aafdb50cd4c9a00356589d"} Oct 04 03:48:35 crc kubenswrapper[4964]: I1004 03:48:35.031594 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29"} Oct 04 03:48:35 crc kubenswrapper[4964]: I1004 03:48:35.031633 4964 scope.go:117] "RemoveContainer" containerID="82bd915c15a9011e7bc4ff3152d45f2c7ff245657682171c865dc7e292e06204" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.167816 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qpttl"] Oct 04 03:50:12 crc kubenswrapper[4964]: E1004 03:50:12.168864 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5863d05-1eee-4f45-b2c6-e7eab197fa1e" containerName="collect-profiles" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.168880 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5863d05-1eee-4f45-b2c6-e7eab197fa1e" containerName="collect-profiles" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.169115 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5863d05-1eee-4f45-b2c6-e7eab197fa1e" containerName="collect-profiles" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.171101 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.180852 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qpttl"] Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.342146 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d78db68f-8c9a-45f9-b3d2-640abd1367a7-utilities\") pod \"redhat-marketplace-qpttl\" (UID: \"d78db68f-8c9a-45f9-b3d2-640abd1367a7\") " pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.342291 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zbtm\" (UniqueName: \"kubernetes.io/projected/d78db68f-8c9a-45f9-b3d2-640abd1367a7-kube-api-access-7zbtm\") pod \"redhat-marketplace-qpttl\" (UID: \"d78db68f-8c9a-45f9-b3d2-640abd1367a7\") " pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.342344 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d78db68f-8c9a-45f9-b3d2-640abd1367a7-catalog-content\") pod \"redhat-marketplace-qpttl\" (UID: \"d78db68f-8c9a-45f9-b3d2-640abd1367a7\") " pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.354426 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dzq4d"] Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.358610 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.385560 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzq4d"] Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.443709 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d78db68f-8c9a-45f9-b3d2-640abd1367a7-utilities\") pod \"redhat-marketplace-qpttl\" (UID: \"d78db68f-8c9a-45f9-b3d2-640abd1367a7\") " pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.443803 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zbtm\" (UniqueName: \"kubernetes.io/projected/d78db68f-8c9a-45f9-b3d2-640abd1367a7-kube-api-access-7zbtm\") pod \"redhat-marketplace-qpttl\" (UID: \"d78db68f-8c9a-45f9-b3d2-640abd1367a7\") " pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.443839 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d78db68f-8c9a-45f9-b3d2-640abd1367a7-catalog-content\") pod \"redhat-marketplace-qpttl\" (UID: \"d78db68f-8c9a-45f9-b3d2-640abd1367a7\") " pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.444157 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d78db68f-8c9a-45f9-b3d2-640abd1367a7-utilities\") pod \"redhat-marketplace-qpttl\" (UID: \"d78db68f-8c9a-45f9-b3d2-640abd1367a7\") " pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.444223 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d78db68f-8c9a-45f9-b3d2-640abd1367a7-catalog-content\") pod \"redhat-marketplace-qpttl\" (UID: \"d78db68f-8c9a-45f9-b3d2-640abd1367a7\") " pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.469562 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zbtm\" (UniqueName: \"kubernetes.io/projected/d78db68f-8c9a-45f9-b3d2-640abd1367a7-kube-api-access-7zbtm\") pod \"redhat-marketplace-qpttl\" (UID: \"d78db68f-8c9a-45f9-b3d2-640abd1367a7\") " pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.494094 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.545206 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5622921b-aa3c-4a14-b979-8e8b9900a933-utilities\") pod \"redhat-operators-dzq4d\" (UID: \"5622921b-aa3c-4a14-b979-8e8b9900a933\") " pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.545669 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqjc2\" (UniqueName: \"kubernetes.io/projected/5622921b-aa3c-4a14-b979-8e8b9900a933-kube-api-access-jqjc2\") pod \"redhat-operators-dzq4d\" (UID: \"5622921b-aa3c-4a14-b979-8e8b9900a933\") " pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.545713 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5622921b-aa3c-4a14-b979-8e8b9900a933-catalog-content\") pod \"redhat-operators-dzq4d\" (UID: \"5622921b-aa3c-4a14-b979-8e8b9900a933\") " 
pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.652448 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqjc2\" (UniqueName: \"kubernetes.io/projected/5622921b-aa3c-4a14-b979-8e8b9900a933-kube-api-access-jqjc2\") pod \"redhat-operators-dzq4d\" (UID: \"5622921b-aa3c-4a14-b979-8e8b9900a933\") " pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.652722 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5622921b-aa3c-4a14-b979-8e8b9900a933-catalog-content\") pod \"redhat-operators-dzq4d\" (UID: \"5622921b-aa3c-4a14-b979-8e8b9900a933\") " pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.652781 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5622921b-aa3c-4a14-b979-8e8b9900a933-utilities\") pod \"redhat-operators-dzq4d\" (UID: \"5622921b-aa3c-4a14-b979-8e8b9900a933\") " pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.653239 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5622921b-aa3c-4a14-b979-8e8b9900a933-utilities\") pod \"redhat-operators-dzq4d\" (UID: \"5622921b-aa3c-4a14-b979-8e8b9900a933\") " pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.654657 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5622921b-aa3c-4a14-b979-8e8b9900a933-catalog-content\") pod \"redhat-operators-dzq4d\" (UID: \"5622921b-aa3c-4a14-b979-8e8b9900a933\") " pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:12 crc 
kubenswrapper[4964]: I1004 03:50:12.679559 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqjc2\" (UniqueName: \"kubernetes.io/projected/5622921b-aa3c-4a14-b979-8e8b9900a933-kube-api-access-jqjc2\") pod \"redhat-operators-dzq4d\" (UID: \"5622921b-aa3c-4a14-b979-8e8b9900a933\") " pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.681060 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:12 crc kubenswrapper[4964]: I1004 03:50:12.977186 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qpttl"] Oct 04 03:50:12 crc kubenswrapper[4964]: W1004 03:50:12.982874 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd78db68f_8c9a_45f9_b3d2_640abd1367a7.slice/crio-21dc73eba7edf50fe644db0f08041ea4f348e1d772111cb1f88da9c24966b9bc WatchSource:0}: Error finding container 21dc73eba7edf50fe644db0f08041ea4f348e1d772111cb1f88da9c24966b9bc: Status 404 returned error can't find the container with id 21dc73eba7edf50fe644db0f08041ea4f348e1d772111cb1f88da9c24966b9bc Oct 04 03:50:13 crc kubenswrapper[4964]: I1004 03:50:13.021113 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qpttl" event={"ID":"d78db68f-8c9a-45f9-b3d2-640abd1367a7","Type":"ContainerStarted","Data":"21dc73eba7edf50fe644db0f08041ea4f348e1d772111cb1f88da9c24966b9bc"} Oct 04 03:50:13 crc kubenswrapper[4964]: I1004 03:50:13.214499 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzq4d"] Oct 04 03:50:13 crc kubenswrapper[4964]: W1004 03:50:13.245964 4964 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5622921b_aa3c_4a14_b979_8e8b9900a933.slice/crio-acf48f3664d1cb016af607e574dee9a40197a36fccf236fa48fa1f00c2da2e7e WatchSource:0}: Error finding container acf48f3664d1cb016af607e574dee9a40197a36fccf236fa48fa1f00c2da2e7e: Status 404 returned error can't find the container with id acf48f3664d1cb016af607e574dee9a40197a36fccf236fa48fa1f00c2da2e7e Oct 04 03:50:14 crc kubenswrapper[4964]: I1004 03:50:14.033918 4964 generic.go:334] "Generic (PLEG): container finished" podID="d78db68f-8c9a-45f9-b3d2-640abd1367a7" containerID="aa17bd043e60cf7304568e31ae7a506e83127d7546d22d7e7173fdd620b89de5" exitCode=0 Oct 04 03:50:14 crc kubenswrapper[4964]: I1004 03:50:14.034559 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qpttl" event={"ID":"d78db68f-8c9a-45f9-b3d2-640abd1367a7","Type":"ContainerDied","Data":"aa17bd043e60cf7304568e31ae7a506e83127d7546d22d7e7173fdd620b89de5"} Oct 04 03:50:14 crc kubenswrapper[4964]: I1004 03:50:14.037261 4964 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 03:50:14 crc kubenswrapper[4964]: I1004 03:50:14.037995 4964 generic.go:334] "Generic (PLEG): container finished" podID="5622921b-aa3c-4a14-b979-8e8b9900a933" containerID="17a77c79d4bdd261c046cbef6e33e3b2100b3ded5444aadb64490a9a2e7d8a9b" exitCode=0 Oct 04 03:50:14 crc kubenswrapper[4964]: I1004 03:50:14.038035 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzq4d" event={"ID":"5622921b-aa3c-4a14-b979-8e8b9900a933","Type":"ContainerDied","Data":"17a77c79d4bdd261c046cbef6e33e3b2100b3ded5444aadb64490a9a2e7d8a9b"} Oct 04 03:50:14 crc kubenswrapper[4964]: I1004 03:50:14.038075 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzq4d" 
event={"ID":"5622921b-aa3c-4a14-b979-8e8b9900a933","Type":"ContainerStarted","Data":"acf48f3664d1cb016af607e574dee9a40197a36fccf236fa48fa1f00c2da2e7e"} Oct 04 03:50:15 crc kubenswrapper[4964]: I1004 03:50:15.051209 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qpttl" event={"ID":"d78db68f-8c9a-45f9-b3d2-640abd1367a7","Type":"ContainerStarted","Data":"1cb998dac3fa00ea3c0c8314085080ac291c72265d57f64eeecef7dd3b668294"} Oct 04 03:50:16 crc kubenswrapper[4964]: I1004 03:50:16.063638 4964 generic.go:334] "Generic (PLEG): container finished" podID="d78db68f-8c9a-45f9-b3d2-640abd1367a7" containerID="1cb998dac3fa00ea3c0c8314085080ac291c72265d57f64eeecef7dd3b668294" exitCode=0 Oct 04 03:50:16 crc kubenswrapper[4964]: I1004 03:50:16.064221 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qpttl" event={"ID":"d78db68f-8c9a-45f9-b3d2-640abd1367a7","Type":"ContainerDied","Data":"1cb998dac3fa00ea3c0c8314085080ac291c72265d57f64eeecef7dd3b668294"} Oct 04 03:50:16 crc kubenswrapper[4964]: I1004 03:50:16.068026 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzq4d" event={"ID":"5622921b-aa3c-4a14-b979-8e8b9900a933","Type":"ContainerStarted","Data":"d2d84a2080f5090a13354e53a965ee27f75d2e70e4cc3e78f0e229c87d27c875"} Oct 04 03:50:18 crc kubenswrapper[4964]: I1004 03:50:18.090831 4964 generic.go:334] "Generic (PLEG): container finished" podID="5622921b-aa3c-4a14-b979-8e8b9900a933" containerID="d2d84a2080f5090a13354e53a965ee27f75d2e70e4cc3e78f0e229c87d27c875" exitCode=0 Oct 04 03:50:18 crc kubenswrapper[4964]: I1004 03:50:18.091064 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzq4d" event={"ID":"5622921b-aa3c-4a14-b979-8e8b9900a933","Type":"ContainerDied","Data":"d2d84a2080f5090a13354e53a965ee27f75d2e70e4cc3e78f0e229c87d27c875"} Oct 04 03:50:18 crc kubenswrapper[4964]: I1004 
03:50:18.095049 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qpttl" event={"ID":"d78db68f-8c9a-45f9-b3d2-640abd1367a7","Type":"ContainerStarted","Data":"e6411feb46cf459b2038facf8662c78f813f8d71eafa980967b970d54f0e1c1a"} Oct 04 03:50:19 crc kubenswrapper[4964]: I1004 03:50:19.110748 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzq4d" event={"ID":"5622921b-aa3c-4a14-b979-8e8b9900a933","Type":"ContainerStarted","Data":"ab0e5d52c2fc9fe83b82da2dd12fb27857e872722c202b6d9db70842acc186e7"} Oct 04 03:50:19 crc kubenswrapper[4964]: I1004 03:50:19.140951 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qpttl" podStartSLOduration=4.660428764 podStartE2EDuration="7.140920422s" podCreationTimestamp="2025-10-04 03:50:12 +0000 UTC" firstStartedPulling="2025-10-04 03:50:14.036576214 +0000 UTC m=+4193.933534882" lastFinishedPulling="2025-10-04 03:50:16.517067892 +0000 UTC m=+4196.414026540" observedRunningTime="2025-10-04 03:50:18.140780318 +0000 UTC m=+4198.037738956" watchObservedRunningTime="2025-10-04 03:50:19.140920422 +0000 UTC m=+4199.037879100" Oct 04 03:50:19 crc kubenswrapper[4964]: I1004 03:50:19.151877 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dzq4d" podStartSLOduration=2.504180908 podStartE2EDuration="7.151854253s" podCreationTimestamp="2025-10-04 03:50:12 +0000 UTC" firstStartedPulling="2025-10-04 03:50:14.040128238 +0000 UTC m=+4193.937086916" lastFinishedPulling="2025-10-04 03:50:18.687801613 +0000 UTC m=+4198.584760261" observedRunningTime="2025-10-04 03:50:19.136792582 +0000 UTC m=+4199.033751260" watchObservedRunningTime="2025-10-04 03:50:19.151854253 +0000 UTC m=+4199.048812921" Oct 04 03:50:22 crc kubenswrapper[4964]: I1004 03:50:22.500296 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:22 crc kubenswrapper[4964]: I1004 03:50:22.500931 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:22 crc kubenswrapper[4964]: I1004 03:50:22.608641 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:22 crc kubenswrapper[4964]: I1004 03:50:22.681924 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:22 crc kubenswrapper[4964]: I1004 03:50:22.682248 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:23 crc kubenswrapper[4964]: I1004 03:50:23.223700 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:23 crc kubenswrapper[4964]: I1004 03:50:23.725273 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dzq4d" podUID="5622921b-aa3c-4a14-b979-8e8b9900a933" containerName="registry-server" probeResult="failure" output=< Oct 04 03:50:23 crc kubenswrapper[4964]: timeout: failed to connect service ":50051" within 1s Oct 04 03:50:23 crc kubenswrapper[4964]: > Oct 04 03:50:23 crc kubenswrapper[4964]: I1004 03:50:23.947816 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qpttl"] Oct 04 03:50:25 crc kubenswrapper[4964]: I1004 03:50:25.177693 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qpttl" podUID="d78db68f-8c9a-45f9-b3d2-640abd1367a7" containerName="registry-server" containerID="cri-o://e6411feb46cf459b2038facf8662c78f813f8d71eafa980967b970d54f0e1c1a" gracePeriod=2 Oct 04 03:50:26 crc kubenswrapper[4964]: 
I1004 03:50:26.189428 4964 generic.go:334] "Generic (PLEG): container finished" podID="d78db68f-8c9a-45f9-b3d2-640abd1367a7" containerID="e6411feb46cf459b2038facf8662c78f813f8d71eafa980967b970d54f0e1c1a" exitCode=0 Oct 04 03:50:26 crc kubenswrapper[4964]: I1004 03:50:26.189487 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qpttl" event={"ID":"d78db68f-8c9a-45f9-b3d2-640abd1367a7","Type":"ContainerDied","Data":"e6411feb46cf459b2038facf8662c78f813f8d71eafa980967b970d54f0e1c1a"} Oct 04 03:50:26 crc kubenswrapper[4964]: I1004 03:50:26.190157 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qpttl" event={"ID":"d78db68f-8c9a-45f9-b3d2-640abd1367a7","Type":"ContainerDied","Data":"21dc73eba7edf50fe644db0f08041ea4f348e1d772111cb1f88da9c24966b9bc"} Oct 04 03:50:26 crc kubenswrapper[4964]: I1004 03:50:26.190177 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21dc73eba7edf50fe644db0f08041ea4f348e1d772111cb1f88da9c24966b9bc" Oct 04 03:50:26 crc kubenswrapper[4964]: I1004 03:50:26.203271 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:26 crc kubenswrapper[4964]: I1004 03:50:26.349724 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d78db68f-8c9a-45f9-b3d2-640abd1367a7-utilities\") pod \"d78db68f-8c9a-45f9-b3d2-640abd1367a7\" (UID: \"d78db68f-8c9a-45f9-b3d2-640abd1367a7\") " Oct 04 03:50:26 crc kubenswrapper[4964]: I1004 03:50:26.349973 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d78db68f-8c9a-45f9-b3d2-640abd1367a7-catalog-content\") pod \"d78db68f-8c9a-45f9-b3d2-640abd1367a7\" (UID: \"d78db68f-8c9a-45f9-b3d2-640abd1367a7\") " Oct 04 03:50:26 crc kubenswrapper[4964]: I1004 03:50:26.350127 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zbtm\" (UniqueName: \"kubernetes.io/projected/d78db68f-8c9a-45f9-b3d2-640abd1367a7-kube-api-access-7zbtm\") pod \"d78db68f-8c9a-45f9-b3d2-640abd1367a7\" (UID: \"d78db68f-8c9a-45f9-b3d2-640abd1367a7\") " Oct 04 03:50:26 crc kubenswrapper[4964]: I1004 03:50:26.350388 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d78db68f-8c9a-45f9-b3d2-640abd1367a7-utilities" (OuterVolumeSpecName: "utilities") pod "d78db68f-8c9a-45f9-b3d2-640abd1367a7" (UID: "d78db68f-8c9a-45f9-b3d2-640abd1367a7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:50:26 crc kubenswrapper[4964]: I1004 03:50:26.350712 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d78db68f-8c9a-45f9-b3d2-640abd1367a7-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:50:26 crc kubenswrapper[4964]: I1004 03:50:26.360406 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d78db68f-8c9a-45f9-b3d2-640abd1367a7-kube-api-access-7zbtm" (OuterVolumeSpecName: "kube-api-access-7zbtm") pod "d78db68f-8c9a-45f9-b3d2-640abd1367a7" (UID: "d78db68f-8c9a-45f9-b3d2-640abd1367a7"). InnerVolumeSpecName "kube-api-access-7zbtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:50:26 crc kubenswrapper[4964]: I1004 03:50:26.362241 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d78db68f-8c9a-45f9-b3d2-640abd1367a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d78db68f-8c9a-45f9-b3d2-640abd1367a7" (UID: "d78db68f-8c9a-45f9-b3d2-640abd1367a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:50:26 crc kubenswrapper[4964]: I1004 03:50:26.452398 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d78db68f-8c9a-45f9-b3d2-640abd1367a7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:50:26 crc kubenswrapper[4964]: I1004 03:50:26.452771 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zbtm\" (UniqueName: \"kubernetes.io/projected/d78db68f-8c9a-45f9-b3d2-640abd1367a7-kube-api-access-7zbtm\") on node \"crc\" DevicePath \"\"" Oct 04 03:50:27 crc kubenswrapper[4964]: I1004 03:50:27.202797 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qpttl" Oct 04 03:50:27 crc kubenswrapper[4964]: I1004 03:50:27.226299 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qpttl"] Oct 04 03:50:27 crc kubenswrapper[4964]: I1004 03:50:27.242765 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qpttl"] Oct 04 03:50:28 crc kubenswrapper[4964]: I1004 03:50:28.868223 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d78db68f-8c9a-45f9-b3d2-640abd1367a7" path="/var/lib/kubelet/pods/d78db68f-8c9a-45f9-b3d2-640abd1367a7/volumes" Oct 04 03:50:32 crc kubenswrapper[4964]: I1004 03:50:32.752122 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:32 crc kubenswrapper[4964]: I1004 03:50:32.856105 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:33 crc kubenswrapper[4964]: I1004 03:50:33.003318 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzq4d"] Oct 04 03:50:34 crc kubenswrapper[4964]: I1004 03:50:34.294869 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dzq4d" podUID="5622921b-aa3c-4a14-b979-8e8b9900a933" containerName="registry-server" containerID="cri-o://ab0e5d52c2fc9fe83b82da2dd12fb27857e872722c202b6d9db70842acc186e7" gracePeriod=2 Oct 04 03:50:34 crc kubenswrapper[4964]: I1004 03:50:34.449306 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:50:34 crc kubenswrapper[4964]: I1004 
03:50:34.449728 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:50:34 crc kubenswrapper[4964]: I1004 03:50:34.912444 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.037923 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqjc2\" (UniqueName: \"kubernetes.io/projected/5622921b-aa3c-4a14-b979-8e8b9900a933-kube-api-access-jqjc2\") pod \"5622921b-aa3c-4a14-b979-8e8b9900a933\" (UID: \"5622921b-aa3c-4a14-b979-8e8b9900a933\") " Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.038111 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5622921b-aa3c-4a14-b979-8e8b9900a933-utilities\") pod \"5622921b-aa3c-4a14-b979-8e8b9900a933\" (UID: \"5622921b-aa3c-4a14-b979-8e8b9900a933\") " Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.038160 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5622921b-aa3c-4a14-b979-8e8b9900a933-catalog-content\") pod \"5622921b-aa3c-4a14-b979-8e8b9900a933\" (UID: \"5622921b-aa3c-4a14-b979-8e8b9900a933\") " Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.039365 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5622921b-aa3c-4a14-b979-8e8b9900a933-utilities" (OuterVolumeSpecName: "utilities") pod "5622921b-aa3c-4a14-b979-8e8b9900a933" (UID: "5622921b-aa3c-4a14-b979-8e8b9900a933"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.044520 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5622921b-aa3c-4a14-b979-8e8b9900a933-kube-api-access-jqjc2" (OuterVolumeSpecName: "kube-api-access-jqjc2") pod "5622921b-aa3c-4a14-b979-8e8b9900a933" (UID: "5622921b-aa3c-4a14-b979-8e8b9900a933"). InnerVolumeSpecName "kube-api-access-jqjc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.122443 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5622921b-aa3c-4a14-b979-8e8b9900a933-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5622921b-aa3c-4a14-b979-8e8b9900a933" (UID: "5622921b-aa3c-4a14-b979-8e8b9900a933"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.140374 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqjc2\" (UniqueName: \"kubernetes.io/projected/5622921b-aa3c-4a14-b979-8e8b9900a933-kube-api-access-jqjc2\") on node \"crc\" DevicePath \"\"" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.140410 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5622921b-aa3c-4a14-b979-8e8b9900a933-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.140420 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5622921b-aa3c-4a14-b979-8e8b9900a933-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.306026 4964 generic.go:334] "Generic (PLEG): container finished" podID="5622921b-aa3c-4a14-b979-8e8b9900a933" 
containerID="ab0e5d52c2fc9fe83b82da2dd12fb27857e872722c202b6d9db70842acc186e7" exitCode=0 Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.306084 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzq4d" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.306091 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzq4d" event={"ID":"5622921b-aa3c-4a14-b979-8e8b9900a933","Type":"ContainerDied","Data":"ab0e5d52c2fc9fe83b82da2dd12fb27857e872722c202b6d9db70842acc186e7"} Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.306215 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzq4d" event={"ID":"5622921b-aa3c-4a14-b979-8e8b9900a933","Type":"ContainerDied","Data":"acf48f3664d1cb016af607e574dee9a40197a36fccf236fa48fa1f00c2da2e7e"} Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.306238 4964 scope.go:117] "RemoveContainer" containerID="ab0e5d52c2fc9fe83b82da2dd12fb27857e872722c202b6d9db70842acc186e7" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.347908 4964 scope.go:117] "RemoveContainer" containerID="d2d84a2080f5090a13354e53a965ee27f75d2e70e4cc3e78f0e229c87d27c875" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.358438 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzq4d"] Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.373430 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dzq4d"] Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.379171 4964 scope.go:117] "RemoveContainer" containerID="17a77c79d4bdd261c046cbef6e33e3b2100b3ded5444aadb64490a9a2e7d8a9b" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.436741 4964 scope.go:117] "RemoveContainer" containerID="ab0e5d52c2fc9fe83b82da2dd12fb27857e872722c202b6d9db70842acc186e7" Oct 04 03:50:35 crc 
kubenswrapper[4964]: E1004 03:50:35.437258 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab0e5d52c2fc9fe83b82da2dd12fb27857e872722c202b6d9db70842acc186e7\": container with ID starting with ab0e5d52c2fc9fe83b82da2dd12fb27857e872722c202b6d9db70842acc186e7 not found: ID does not exist" containerID="ab0e5d52c2fc9fe83b82da2dd12fb27857e872722c202b6d9db70842acc186e7" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.437311 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab0e5d52c2fc9fe83b82da2dd12fb27857e872722c202b6d9db70842acc186e7"} err="failed to get container status \"ab0e5d52c2fc9fe83b82da2dd12fb27857e872722c202b6d9db70842acc186e7\": rpc error: code = NotFound desc = could not find container \"ab0e5d52c2fc9fe83b82da2dd12fb27857e872722c202b6d9db70842acc186e7\": container with ID starting with ab0e5d52c2fc9fe83b82da2dd12fb27857e872722c202b6d9db70842acc186e7 not found: ID does not exist" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.437342 4964 scope.go:117] "RemoveContainer" containerID="d2d84a2080f5090a13354e53a965ee27f75d2e70e4cc3e78f0e229c87d27c875" Oct 04 03:50:35 crc kubenswrapper[4964]: E1004 03:50:35.437764 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2d84a2080f5090a13354e53a965ee27f75d2e70e4cc3e78f0e229c87d27c875\": container with ID starting with d2d84a2080f5090a13354e53a965ee27f75d2e70e4cc3e78f0e229c87d27c875 not found: ID does not exist" containerID="d2d84a2080f5090a13354e53a965ee27f75d2e70e4cc3e78f0e229c87d27c875" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.437805 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2d84a2080f5090a13354e53a965ee27f75d2e70e4cc3e78f0e229c87d27c875"} err="failed to get container status 
\"d2d84a2080f5090a13354e53a965ee27f75d2e70e4cc3e78f0e229c87d27c875\": rpc error: code = NotFound desc = could not find container \"d2d84a2080f5090a13354e53a965ee27f75d2e70e4cc3e78f0e229c87d27c875\": container with ID starting with d2d84a2080f5090a13354e53a965ee27f75d2e70e4cc3e78f0e229c87d27c875 not found: ID does not exist" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.437832 4964 scope.go:117] "RemoveContainer" containerID="17a77c79d4bdd261c046cbef6e33e3b2100b3ded5444aadb64490a9a2e7d8a9b" Oct 04 03:50:35 crc kubenswrapper[4964]: E1004 03:50:35.438118 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17a77c79d4bdd261c046cbef6e33e3b2100b3ded5444aadb64490a9a2e7d8a9b\": container with ID starting with 17a77c79d4bdd261c046cbef6e33e3b2100b3ded5444aadb64490a9a2e7d8a9b not found: ID does not exist" containerID="17a77c79d4bdd261c046cbef6e33e3b2100b3ded5444aadb64490a9a2e7d8a9b" Oct 04 03:50:35 crc kubenswrapper[4964]: I1004 03:50:35.438149 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a77c79d4bdd261c046cbef6e33e3b2100b3ded5444aadb64490a9a2e7d8a9b"} err="failed to get container status \"17a77c79d4bdd261c046cbef6e33e3b2100b3ded5444aadb64490a9a2e7d8a9b\": rpc error: code = NotFound desc = could not find container \"17a77c79d4bdd261c046cbef6e33e3b2100b3ded5444aadb64490a9a2e7d8a9b\": container with ID starting with 17a77c79d4bdd261c046cbef6e33e3b2100b3ded5444aadb64490a9a2e7d8a9b not found: ID does not exist" Oct 04 03:50:36 crc kubenswrapper[4964]: I1004 03:50:36.854168 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5622921b-aa3c-4a14-b979-8e8b9900a933" path="/var/lib/kubelet/pods/5622921b-aa3c-4a14-b979-8e8b9900a933/volumes" Oct 04 03:51:04 crc kubenswrapper[4964]: I1004 03:51:04.449439 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:51:04 crc kubenswrapper[4964]: I1004 03:51:04.449955 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:51:34 crc kubenswrapper[4964]: I1004 03:51:34.448855 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:51:34 crc kubenswrapper[4964]: I1004 03:51:34.449359 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:51:34 crc kubenswrapper[4964]: I1004 03:51:34.449419 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 03:51:34 crc kubenswrapper[4964]: I1004 03:51:34.450405 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 03:51:34 crc 
kubenswrapper[4964]: I1004 03:51:34.450497 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" gracePeriod=600 Oct 04 03:51:34 crc kubenswrapper[4964]: E1004 03:51:34.577713 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:51:34 crc kubenswrapper[4964]: I1004 03:51:34.877530 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" exitCode=0 Oct 04 03:51:34 crc kubenswrapper[4964]: I1004 03:51:34.877568 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29"} Oct 04 03:51:34 crc kubenswrapper[4964]: I1004 03:51:34.877597 4964 scope.go:117] "RemoveContainer" containerID="6531ea61e3c01e8aa200a1fc0c735dfaaea9219b30aafdb50cd4c9a00356589d" Oct 04 03:51:34 crc kubenswrapper[4964]: I1004 03:51:34.878407 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:51:34 crc kubenswrapper[4964]: E1004 03:51:34.878786 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:51:47 crc kubenswrapper[4964]: I1004 03:51:47.846063 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:51:47 crc kubenswrapper[4964]: E1004 03:51:47.847070 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:52:00 crc kubenswrapper[4964]: I1004 03:52:00.867765 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:52:00 crc kubenswrapper[4964]: E1004 03:52:00.869512 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:52:11 crc kubenswrapper[4964]: I1004 03:52:11.845707 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:52:11 crc kubenswrapper[4964]: E1004 03:52:11.846702 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:52:23 crc kubenswrapper[4964]: I1004 03:52:23.845951 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:52:23 crc kubenswrapper[4964]: E1004 03:52:23.847186 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:52:34 crc kubenswrapper[4964]: I1004 03:52:34.845279 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:52:34 crc kubenswrapper[4964]: E1004 03:52:34.845908 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:52:49 crc kubenswrapper[4964]: I1004 03:52:49.845318 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:52:49 crc kubenswrapper[4964]: E1004 03:52:49.846020 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:53:03 crc kubenswrapper[4964]: I1004 03:53:03.845519 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:53:03 crc kubenswrapper[4964]: E1004 03:53:03.846437 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:53:14 crc kubenswrapper[4964]: I1004 03:53:14.846864 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:53:14 crc kubenswrapper[4964]: E1004 03:53:14.847892 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:53:26 crc kubenswrapper[4964]: I1004 03:53:26.846139 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:53:26 crc kubenswrapper[4964]: E1004 03:53:26.847715 4964 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:53:40 crc kubenswrapper[4964]: I1004 03:53:40.870356 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:53:40 crc kubenswrapper[4964]: E1004 03:53:40.871467 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.356831 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sg5gx"] Oct 04 03:53:44 crc kubenswrapper[4964]: E1004 03:53:44.358317 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5622921b-aa3c-4a14-b979-8e8b9900a933" containerName="registry-server" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.358343 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="5622921b-aa3c-4a14-b979-8e8b9900a933" containerName="registry-server" Oct 04 03:53:44 crc kubenswrapper[4964]: E1004 03:53:44.358385 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d78db68f-8c9a-45f9-b3d2-640abd1367a7" containerName="extract-utilities" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.358399 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="d78db68f-8c9a-45f9-b3d2-640abd1367a7" 
containerName="extract-utilities" Oct 04 03:53:44 crc kubenswrapper[4964]: E1004 03:53:44.358429 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5622921b-aa3c-4a14-b979-8e8b9900a933" containerName="extract-content" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.358444 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="5622921b-aa3c-4a14-b979-8e8b9900a933" containerName="extract-content" Oct 04 03:53:44 crc kubenswrapper[4964]: E1004 03:53:44.358461 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5622921b-aa3c-4a14-b979-8e8b9900a933" containerName="extract-utilities" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.358473 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="5622921b-aa3c-4a14-b979-8e8b9900a933" containerName="extract-utilities" Oct 04 03:53:44 crc kubenswrapper[4964]: E1004 03:53:44.358505 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d78db68f-8c9a-45f9-b3d2-640abd1367a7" containerName="extract-content" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.358518 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="d78db68f-8c9a-45f9-b3d2-640abd1367a7" containerName="extract-content" Oct 04 03:53:44 crc kubenswrapper[4964]: E1004 03:53:44.358543 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d78db68f-8c9a-45f9-b3d2-640abd1367a7" containerName="registry-server" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.358554 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="d78db68f-8c9a-45f9-b3d2-640abd1367a7" containerName="registry-server" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.358912 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="d78db68f-8c9a-45f9-b3d2-640abd1367a7" containerName="registry-server" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.358951 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="5622921b-aa3c-4a14-b979-8e8b9900a933" 
containerName="registry-server" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.361324 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.374652 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sg5gx"] Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.441307 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8463e9a-2c03-4cd9-9617-998e597f6497-catalog-content\") pod \"community-operators-sg5gx\" (UID: \"a8463e9a-2c03-4cd9-9617-998e597f6497\") " pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.441476 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8463e9a-2c03-4cd9-9617-998e597f6497-utilities\") pod \"community-operators-sg5gx\" (UID: \"a8463e9a-2c03-4cd9-9617-998e597f6497\") " pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.441528 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dq46\" (UniqueName: \"kubernetes.io/projected/a8463e9a-2c03-4cd9-9617-998e597f6497-kube-api-access-4dq46\") pod \"community-operators-sg5gx\" (UID: \"a8463e9a-2c03-4cd9-9617-998e597f6497\") " pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.543422 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8463e9a-2c03-4cd9-9617-998e597f6497-catalog-content\") pod \"community-operators-sg5gx\" (UID: \"a8463e9a-2c03-4cd9-9617-998e597f6497\") " 
pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.543691 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8463e9a-2c03-4cd9-9617-998e597f6497-utilities\") pod \"community-operators-sg5gx\" (UID: \"a8463e9a-2c03-4cd9-9617-998e597f6497\") " pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.543766 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dq46\" (UniqueName: \"kubernetes.io/projected/a8463e9a-2c03-4cd9-9617-998e597f6497-kube-api-access-4dq46\") pod \"community-operators-sg5gx\" (UID: \"a8463e9a-2c03-4cd9-9617-998e597f6497\") " pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.544662 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8463e9a-2c03-4cd9-9617-998e597f6497-catalog-content\") pod \"community-operators-sg5gx\" (UID: \"a8463e9a-2c03-4cd9-9617-998e597f6497\") " pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.544860 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8463e9a-2c03-4cd9-9617-998e597f6497-utilities\") pod \"community-operators-sg5gx\" (UID: \"a8463e9a-2c03-4cd9-9617-998e597f6497\") " pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.569503 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-smp4d"] Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.572308 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.577878 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dq46\" (UniqueName: \"kubernetes.io/projected/a8463e9a-2c03-4cd9-9617-998e597f6497-kube-api-access-4dq46\") pod \"community-operators-sg5gx\" (UID: \"a8463e9a-2c03-4cd9-9617-998e597f6497\") " pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.594297 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-smp4d"] Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.647712 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6205b916-e513-4480-9774-1076d139dafb-catalog-content\") pod \"certified-operators-smp4d\" (UID: \"6205b916-e513-4480-9774-1076d139dafb\") " pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.647991 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6205b916-e513-4480-9774-1076d139dafb-utilities\") pod \"certified-operators-smp4d\" (UID: \"6205b916-e513-4480-9774-1076d139dafb\") " pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.648131 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8qwn\" (UniqueName: \"kubernetes.io/projected/6205b916-e513-4480-9774-1076d139dafb-kube-api-access-d8qwn\") pod \"certified-operators-smp4d\" (UID: \"6205b916-e513-4480-9774-1076d139dafb\") " pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.694937 4964 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.749541 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6205b916-e513-4480-9774-1076d139dafb-catalog-content\") pod \"certified-operators-smp4d\" (UID: \"6205b916-e513-4480-9774-1076d139dafb\") " pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.749786 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6205b916-e513-4480-9774-1076d139dafb-utilities\") pod \"certified-operators-smp4d\" (UID: \"6205b916-e513-4480-9774-1076d139dafb\") " pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.749849 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8qwn\" (UniqueName: \"kubernetes.io/projected/6205b916-e513-4480-9774-1076d139dafb-kube-api-access-d8qwn\") pod \"certified-operators-smp4d\" (UID: \"6205b916-e513-4480-9774-1076d139dafb\") " pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.750125 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6205b916-e513-4480-9774-1076d139dafb-catalog-content\") pod \"certified-operators-smp4d\" (UID: \"6205b916-e513-4480-9774-1076d139dafb\") " pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.750796 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6205b916-e513-4480-9774-1076d139dafb-utilities\") pod \"certified-operators-smp4d\" (UID: \"6205b916-e513-4480-9774-1076d139dafb\") " 
pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.775421 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8qwn\" (UniqueName: \"kubernetes.io/projected/6205b916-e513-4480-9774-1076d139dafb-kube-api-access-d8qwn\") pod \"certified-operators-smp4d\" (UID: \"6205b916-e513-4480-9774-1076d139dafb\") " pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:44 crc kubenswrapper[4964]: I1004 03:53:44.942871 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:45 crc kubenswrapper[4964]: I1004 03:53:45.260429 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sg5gx"] Oct 04 03:53:45 crc kubenswrapper[4964]: I1004 03:53:45.825909 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-smp4d"] Oct 04 03:53:45 crc kubenswrapper[4964]: W1004 03:53:45.830118 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6205b916_e513_4480_9774_1076d139dafb.slice/crio-2f88d1336c3c3bf3556e3992b9e8d051738f92d8edb0f62c7c1f5af3bfa89ddc WatchSource:0}: Error finding container 2f88d1336c3c3bf3556e3992b9e8d051738f92d8edb0f62c7c1f5af3bfa89ddc: Status 404 returned error can't find the container with id 2f88d1336c3c3bf3556e3992b9e8d051738f92d8edb0f62c7c1f5af3bfa89ddc Oct 04 03:53:46 crc kubenswrapper[4964]: I1004 03:53:46.288174 4964 generic.go:334] "Generic (PLEG): container finished" podID="6205b916-e513-4480-9774-1076d139dafb" containerID="f82b1495a560c3489bc0ec88689606cc1dfcdf196c4e158e6b05009f68aab533" exitCode=0 Oct 04 03:53:46 crc kubenswrapper[4964]: I1004 03:53:46.288491 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smp4d" 
event={"ID":"6205b916-e513-4480-9774-1076d139dafb","Type":"ContainerDied","Data":"f82b1495a560c3489bc0ec88689606cc1dfcdf196c4e158e6b05009f68aab533"} Oct 04 03:53:46 crc kubenswrapper[4964]: I1004 03:53:46.288523 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smp4d" event={"ID":"6205b916-e513-4480-9774-1076d139dafb","Type":"ContainerStarted","Data":"2f88d1336c3c3bf3556e3992b9e8d051738f92d8edb0f62c7c1f5af3bfa89ddc"} Oct 04 03:53:46 crc kubenswrapper[4964]: I1004 03:53:46.291825 4964 generic.go:334] "Generic (PLEG): container finished" podID="a8463e9a-2c03-4cd9-9617-998e597f6497" containerID="725e04c44ede104846ca1e88340d603fb8ed76709ea08ddbf4acb95e5d07e4ea" exitCode=0 Oct 04 03:53:46 crc kubenswrapper[4964]: I1004 03:53:46.291875 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sg5gx" event={"ID":"a8463e9a-2c03-4cd9-9617-998e597f6497","Type":"ContainerDied","Data":"725e04c44ede104846ca1e88340d603fb8ed76709ea08ddbf4acb95e5d07e4ea"} Oct 04 03:53:46 crc kubenswrapper[4964]: I1004 03:53:46.291905 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sg5gx" event={"ID":"a8463e9a-2c03-4cd9-9617-998e597f6497","Type":"ContainerStarted","Data":"990f391061778c901c2a09f8820a896366057716de66cb7d866db41cf54bc9d0"} Oct 04 03:53:47 crc kubenswrapper[4964]: I1004 03:53:47.301789 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sg5gx" event={"ID":"a8463e9a-2c03-4cd9-9617-998e597f6497","Type":"ContainerStarted","Data":"6e41b86a25ca6a9093b6df30cab5af878f7b886efe92473b1dc7c40f9927e882"} Oct 04 03:53:47 crc kubenswrapper[4964]: I1004 03:53:47.303976 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smp4d" 
event={"ID":"6205b916-e513-4480-9774-1076d139dafb","Type":"ContainerStarted","Data":"de82a9dc9ebbe8d6ef856a64275b8d52ee6a865f849b50a237d1b8440c17478a"} Oct 04 03:53:49 crc kubenswrapper[4964]: I1004 03:53:49.335708 4964 generic.go:334] "Generic (PLEG): container finished" podID="a8463e9a-2c03-4cd9-9617-998e597f6497" containerID="6e41b86a25ca6a9093b6df30cab5af878f7b886efe92473b1dc7c40f9927e882" exitCode=0 Oct 04 03:53:49 crc kubenswrapper[4964]: I1004 03:53:49.336665 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sg5gx" event={"ID":"a8463e9a-2c03-4cd9-9617-998e597f6497","Type":"ContainerDied","Data":"6e41b86a25ca6a9093b6df30cab5af878f7b886efe92473b1dc7c40f9927e882"} Oct 04 03:53:49 crc kubenswrapper[4964]: I1004 03:53:49.346068 4964 generic.go:334] "Generic (PLEG): container finished" podID="6205b916-e513-4480-9774-1076d139dafb" containerID="de82a9dc9ebbe8d6ef856a64275b8d52ee6a865f849b50a237d1b8440c17478a" exitCode=0 Oct 04 03:53:49 crc kubenswrapper[4964]: I1004 03:53:49.346228 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smp4d" event={"ID":"6205b916-e513-4480-9774-1076d139dafb","Type":"ContainerDied","Data":"de82a9dc9ebbe8d6ef856a64275b8d52ee6a865f849b50a237d1b8440c17478a"} Oct 04 03:53:50 crc kubenswrapper[4964]: I1004 03:53:50.368072 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sg5gx" event={"ID":"a8463e9a-2c03-4cd9-9617-998e597f6497","Type":"ContainerStarted","Data":"4a841147971f4f242c05f303185fc951afa3aaf1b2c42cba4c5134b2a7387d0b"} Oct 04 03:53:50 crc kubenswrapper[4964]: I1004 03:53:50.380465 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smp4d" event={"ID":"6205b916-e513-4480-9774-1076d139dafb","Type":"ContainerStarted","Data":"e569465e61541c01c93ab4b2d6e2d5af6bf9f2053b0fd3a86c8769caaef3a28e"} Oct 04 03:53:50 crc kubenswrapper[4964]: 
I1004 03:53:50.392746 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sg5gx" podStartSLOduration=2.896743194 podStartE2EDuration="6.392727645s" podCreationTimestamp="2025-10-04 03:53:44 +0000 UTC" firstStartedPulling="2025-10-04 03:53:46.293482833 +0000 UTC m=+4406.190441501" lastFinishedPulling="2025-10-04 03:53:49.789467304 +0000 UTC m=+4409.686425952" observedRunningTime="2025-10-04 03:53:50.390117466 +0000 UTC m=+4410.287076104" watchObservedRunningTime="2025-10-04 03:53:50.392727645 +0000 UTC m=+4410.289686283" Oct 04 03:53:53 crc kubenswrapper[4964]: I1004 03:53:53.846022 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:53:53 crc kubenswrapper[4964]: E1004 03:53:53.846679 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:53:54 crc kubenswrapper[4964]: I1004 03:53:54.695200 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:54 crc kubenswrapper[4964]: I1004 03:53:54.695417 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:54 crc kubenswrapper[4964]: I1004 03:53:54.787734 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:54 crc kubenswrapper[4964]: I1004 03:53:54.833853 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-smp4d" podStartSLOduration=7.357924761 podStartE2EDuration="10.833819096s" podCreationTimestamp="2025-10-04 03:53:44 +0000 UTC" firstStartedPulling="2025-10-04 03:53:46.291205914 +0000 UTC m=+4406.188164582" lastFinishedPulling="2025-10-04 03:53:49.767100269 +0000 UTC m=+4409.664058917" observedRunningTime="2025-10-04 03:53:50.41206876 +0000 UTC m=+4410.309027418" watchObservedRunningTime="2025-10-04 03:53:54.833819096 +0000 UTC m=+4414.730777774" Oct 04 03:53:54 crc kubenswrapper[4964]: I1004 03:53:54.943910 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:54 crc kubenswrapper[4964]: I1004 03:53:54.943961 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:55 crc kubenswrapper[4964]: I1004 03:53:55.030122 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:55 crc kubenswrapper[4964]: I1004 03:53:55.750657 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:55 crc kubenswrapper[4964]: I1004 03:53:55.756766 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:57 crc kubenswrapper[4964]: I1004 03:53:57.147224 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sg5gx"] Oct 04 03:53:57 crc kubenswrapper[4964]: I1004 03:53:57.456197 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sg5gx" podUID="a8463e9a-2c03-4cd9-9617-998e597f6497" containerName="registry-server" containerID="cri-o://4a841147971f4f242c05f303185fc951afa3aaf1b2c42cba4c5134b2a7387d0b" gracePeriod=2 Oct 
04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.151538 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-smp4d"] Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.152588 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-smp4d" podUID="6205b916-e513-4480-9774-1076d139dafb" containerName="registry-server" containerID="cri-o://e569465e61541c01c93ab4b2d6e2d5af6bf9f2053b0fd3a86c8769caaef3a28e" gracePeriod=2 Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.479101 4964 generic.go:334] "Generic (PLEG): container finished" podID="6205b916-e513-4480-9774-1076d139dafb" containerID="e569465e61541c01c93ab4b2d6e2d5af6bf9f2053b0fd3a86c8769caaef3a28e" exitCode=0 Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.479206 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smp4d" event={"ID":"6205b916-e513-4480-9774-1076d139dafb","Type":"ContainerDied","Data":"e569465e61541c01c93ab4b2d6e2d5af6bf9f2053b0fd3a86c8769caaef3a28e"} Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.483498 4964 generic.go:334] "Generic (PLEG): container finished" podID="a8463e9a-2c03-4cd9-9617-998e597f6497" containerID="4a841147971f4f242c05f303185fc951afa3aaf1b2c42cba4c5134b2a7387d0b" exitCode=0 Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.483637 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sg5gx" event={"ID":"a8463e9a-2c03-4cd9-9617-998e597f6497","Type":"ContainerDied","Data":"4a841147971f4f242c05f303185fc951afa3aaf1b2c42cba4c5134b2a7387d0b"} Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.787071 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.874910 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8463e9a-2c03-4cd9-9617-998e597f6497-catalog-content\") pod \"a8463e9a-2c03-4cd9-9617-998e597f6497\" (UID: \"a8463e9a-2c03-4cd9-9617-998e597f6497\") " Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.875364 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8463e9a-2c03-4cd9-9617-998e597f6497-utilities\") pod \"a8463e9a-2c03-4cd9-9617-998e597f6497\" (UID: \"a8463e9a-2c03-4cd9-9617-998e597f6497\") " Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.875408 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dq46\" (UniqueName: \"kubernetes.io/projected/a8463e9a-2c03-4cd9-9617-998e597f6497-kube-api-access-4dq46\") pod \"a8463e9a-2c03-4cd9-9617-998e597f6497\" (UID: \"a8463e9a-2c03-4cd9-9617-998e597f6497\") " Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.876361 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8463e9a-2c03-4cd9-9617-998e597f6497-utilities" (OuterVolumeSpecName: "utilities") pod "a8463e9a-2c03-4cd9-9617-998e597f6497" (UID: "a8463e9a-2c03-4cd9-9617-998e597f6497"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.885164 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8463e9a-2c03-4cd9-9617-998e597f6497-kube-api-access-4dq46" (OuterVolumeSpecName: "kube-api-access-4dq46") pod "a8463e9a-2c03-4cd9-9617-998e597f6497" (UID: "a8463e9a-2c03-4cd9-9617-998e597f6497"). InnerVolumeSpecName "kube-api-access-4dq46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.929842 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8463e9a-2c03-4cd9-9617-998e597f6497-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8463e9a-2c03-4cd9-9617-998e597f6497" (UID: "a8463e9a-2c03-4cd9-9617-998e597f6497"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.978325 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8463e9a-2c03-4cd9-9617-998e597f6497-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.978376 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dq46\" (UniqueName: \"kubernetes.io/projected/a8463e9a-2c03-4cd9-9617-998e597f6497-kube-api-access-4dq46\") on node \"crc\" DevicePath \"\"" Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.978399 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8463e9a-2c03-4cd9-9617-998e597f6497-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:53:58 crc kubenswrapper[4964]: I1004 03:53:58.983557 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.079397 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6205b916-e513-4480-9774-1076d139dafb-utilities\") pod \"6205b916-e513-4480-9774-1076d139dafb\" (UID: \"6205b916-e513-4480-9774-1076d139dafb\") " Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.079879 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6205b916-e513-4480-9774-1076d139dafb-utilities" (OuterVolumeSpecName: "utilities") pod "6205b916-e513-4480-9774-1076d139dafb" (UID: "6205b916-e513-4480-9774-1076d139dafb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.080324 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8qwn\" (UniqueName: \"kubernetes.io/projected/6205b916-e513-4480-9774-1076d139dafb-kube-api-access-d8qwn\") pod \"6205b916-e513-4480-9774-1076d139dafb\" (UID: \"6205b916-e513-4480-9774-1076d139dafb\") " Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.080684 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6205b916-e513-4480-9774-1076d139dafb-catalog-content\") pod \"6205b916-e513-4480-9774-1076d139dafb\" (UID: \"6205b916-e513-4480-9774-1076d139dafb\") " Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.081448 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6205b916-e513-4480-9774-1076d139dafb-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.084369 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6205b916-e513-4480-9774-1076d139dafb-kube-api-access-d8qwn" (OuterVolumeSpecName: "kube-api-access-d8qwn") pod "6205b916-e513-4480-9774-1076d139dafb" (UID: "6205b916-e513-4480-9774-1076d139dafb"). InnerVolumeSpecName "kube-api-access-d8qwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.117444 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6205b916-e513-4480-9774-1076d139dafb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6205b916-e513-4480-9774-1076d139dafb" (UID: "6205b916-e513-4480-9774-1076d139dafb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.183115 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6205b916-e513-4480-9774-1076d139dafb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.183152 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8qwn\" (UniqueName: \"kubernetes.io/projected/6205b916-e513-4480-9774-1076d139dafb-kube-api-access-d8qwn\") on node \"crc\" DevicePath \"\"" Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.497298 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smp4d" event={"ID":"6205b916-e513-4480-9774-1076d139dafb","Type":"ContainerDied","Data":"2f88d1336c3c3bf3556e3992b9e8d051738f92d8edb0f62c7c1f5af3bfa89ddc"} Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.497319 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-smp4d" Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.497721 4964 scope.go:117] "RemoveContainer" containerID="e569465e61541c01c93ab4b2d6e2d5af6bf9f2053b0fd3a86c8769caaef3a28e" Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.500338 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sg5gx" event={"ID":"a8463e9a-2c03-4cd9-9617-998e597f6497","Type":"ContainerDied","Data":"990f391061778c901c2a09f8820a896366057716de66cb7d866db41cf54bc9d0"} Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.500432 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sg5gx" Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.529986 4964 scope.go:117] "RemoveContainer" containerID="de82a9dc9ebbe8d6ef856a64275b8d52ee6a865f849b50a237d1b8440c17478a" Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.578425 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sg5gx"] Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.605062 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sg5gx"] Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.610102 4964 scope.go:117] "RemoveContainer" containerID="f82b1495a560c3489bc0ec88689606cc1dfcdf196c4e158e6b05009f68aab533" Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.659742 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-smp4d"] Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.667440 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-smp4d"] Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.719213 4964 scope.go:117] "RemoveContainer" 
containerID="4a841147971f4f242c05f303185fc951afa3aaf1b2c42cba4c5134b2a7387d0b" Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.760126 4964 scope.go:117] "RemoveContainer" containerID="6e41b86a25ca6a9093b6df30cab5af878f7b886efe92473b1dc7c40f9927e882" Oct 04 03:53:59 crc kubenswrapper[4964]: I1004 03:53:59.780514 4964 scope.go:117] "RemoveContainer" containerID="725e04c44ede104846ca1e88340d603fb8ed76709ea08ddbf4acb95e5d07e4ea" Oct 04 03:54:00 crc kubenswrapper[4964]: I1004 03:54:00.859857 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6205b916-e513-4480-9774-1076d139dafb" path="/var/lib/kubelet/pods/6205b916-e513-4480-9774-1076d139dafb/volumes" Oct 04 03:54:00 crc kubenswrapper[4964]: I1004 03:54:00.861374 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8463e9a-2c03-4cd9-9617-998e597f6497" path="/var/lib/kubelet/pods/a8463e9a-2c03-4cd9-9617-998e597f6497/volumes" Oct 04 03:54:04 crc kubenswrapper[4964]: I1004 03:54:04.855214 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:54:04 crc kubenswrapper[4964]: E1004 03:54:04.857689 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:54:19 crc kubenswrapper[4964]: I1004 03:54:19.846228 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:54:19 crc kubenswrapper[4964]: E1004 03:54:19.847710 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:54:31 crc kubenswrapper[4964]: I1004 03:54:31.845582 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:54:31 crc kubenswrapper[4964]: E1004 03:54:31.847806 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:54:45 crc kubenswrapper[4964]: I1004 03:54:45.847461 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:54:45 crc kubenswrapper[4964]: E1004 03:54:45.849019 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:54:56 crc kubenswrapper[4964]: I1004 03:54:56.846149 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:54:56 crc kubenswrapper[4964]: E1004 03:54:56.846938 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:55:11 crc kubenswrapper[4964]: I1004 03:55:11.845444 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:55:11 crc kubenswrapper[4964]: E1004 03:55:11.847231 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:55:22 crc kubenswrapper[4964]: I1004 03:55:22.845732 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:55:22 crc kubenswrapper[4964]: E1004 03:55:22.846946 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:55:36 crc kubenswrapper[4964]: I1004 03:55:36.845712 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:55:36 crc kubenswrapper[4964]: E1004 03:55:36.847108 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:55:47 crc kubenswrapper[4964]: I1004 03:55:47.845811 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:55:47 crc kubenswrapper[4964]: E1004 03:55:47.846586 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:55:59 crc kubenswrapper[4964]: I1004 03:55:59.845599 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:55:59 crc kubenswrapper[4964]: E1004 03:55:59.846588 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:56:11 crc kubenswrapper[4964]: I1004 03:56:11.845750 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:56:11 crc kubenswrapper[4964]: E1004 03:56:11.846606 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:56:22 crc kubenswrapper[4964]: I1004 03:56:22.845270 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:56:22 crc kubenswrapper[4964]: E1004 03:56:22.846374 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:56:30 crc kubenswrapper[4964]: I1004 03:56:30.652828 4964 scope.go:117] "RemoveContainer" containerID="e6411feb46cf459b2038facf8662c78f813f8d71eafa980967b970d54f0e1c1a" Oct 04 03:56:30 crc kubenswrapper[4964]: I1004 03:56:30.684429 4964 scope.go:117] "RemoveContainer" containerID="1cb998dac3fa00ea3c0c8314085080ac291c72265d57f64eeecef7dd3b668294" Oct 04 03:56:30 crc kubenswrapper[4964]: I1004 03:56:30.707936 4964 scope.go:117] "RemoveContainer" containerID="aa17bd043e60cf7304568e31ae7a506e83127d7546d22d7e7173fdd620b89de5" Oct 04 03:56:33 crc kubenswrapper[4964]: I1004 03:56:33.846302 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:56:33 crc kubenswrapper[4964]: E1004 03:56:33.847114 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 03:56:47 crc kubenswrapper[4964]: I1004 03:56:47.845901 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 03:56:48 crc kubenswrapper[4964]: I1004 03:56:48.416383 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"7acd97a2d943148214dec2a1e45b876bb35371b2538d34908344419f0e83dc2c"} Oct 04 03:59:04 crc kubenswrapper[4964]: I1004 03:59:04.449073 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:59:04 crc kubenswrapper[4964]: I1004 03:59:04.449828 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 03:59:34 crc kubenswrapper[4964]: I1004 03:59:34.449410 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 03:59:34 crc kubenswrapper[4964]: I1004 03:59:34.450076 4964 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.174861 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5"] Oct 04 04:00:00 crc kubenswrapper[4964]: E1004 04:00:00.176121 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8463e9a-2c03-4cd9-9617-998e597f6497" containerName="extract-utilities" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.176146 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8463e9a-2c03-4cd9-9617-998e597f6497" containerName="extract-utilities" Oct 04 04:00:00 crc kubenswrapper[4964]: E1004 04:00:00.176170 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8463e9a-2c03-4cd9-9617-998e597f6497" containerName="registry-server" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.176183 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8463e9a-2c03-4cd9-9617-998e597f6497" containerName="registry-server" Oct 04 04:00:00 crc kubenswrapper[4964]: E1004 04:00:00.176231 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6205b916-e513-4480-9774-1076d139dafb" containerName="extract-content" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.176243 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6205b916-e513-4480-9774-1076d139dafb" containerName="extract-content" Oct 04 04:00:00 crc kubenswrapper[4964]: E1004 04:00:00.176272 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6205b916-e513-4480-9774-1076d139dafb" containerName="registry-server" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.176284 4964 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6205b916-e513-4480-9774-1076d139dafb" containerName="registry-server" Oct 04 04:00:00 crc kubenswrapper[4964]: E1004 04:00:00.176302 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8463e9a-2c03-4cd9-9617-998e597f6497" containerName="extract-content" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.176314 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8463e9a-2c03-4cd9-9617-998e597f6497" containerName="extract-content" Oct 04 04:00:00 crc kubenswrapper[4964]: E1004 04:00:00.176333 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6205b916-e513-4480-9774-1076d139dafb" containerName="extract-utilities" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.176345 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6205b916-e513-4480-9774-1076d139dafb" containerName="extract-utilities" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.176718 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="6205b916-e513-4480-9774-1076d139dafb" containerName="registry-server" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.176766 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8463e9a-2c03-4cd9-9617-998e597f6497" containerName="registry-server" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.178016 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.182153 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.182490 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.193725 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5"] Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.276786 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6s7g\" (UniqueName: \"kubernetes.io/projected/82ff352b-58da-4327-8d38-7ecb34a4058e-kube-api-access-h6s7g\") pod \"collect-profiles-29325840-gsdn5\" (UID: \"82ff352b-58da-4327-8d38-7ecb34a4058e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.276867 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82ff352b-58da-4327-8d38-7ecb34a4058e-config-volume\") pod \"collect-profiles-29325840-gsdn5\" (UID: \"82ff352b-58da-4327-8d38-7ecb34a4058e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.277237 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82ff352b-58da-4327-8d38-7ecb34a4058e-secret-volume\") pod \"collect-profiles-29325840-gsdn5\" (UID: \"82ff352b-58da-4327-8d38-7ecb34a4058e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.379982 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82ff352b-58da-4327-8d38-7ecb34a4058e-secret-volume\") pod \"collect-profiles-29325840-gsdn5\" (UID: \"82ff352b-58da-4327-8d38-7ecb34a4058e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.380146 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6s7g\" (UniqueName: \"kubernetes.io/projected/82ff352b-58da-4327-8d38-7ecb34a4058e-kube-api-access-h6s7g\") pod \"collect-profiles-29325840-gsdn5\" (UID: \"82ff352b-58da-4327-8d38-7ecb34a4058e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.380201 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82ff352b-58da-4327-8d38-7ecb34a4058e-config-volume\") pod \"collect-profiles-29325840-gsdn5\" (UID: \"82ff352b-58da-4327-8d38-7ecb34a4058e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.381394 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82ff352b-58da-4327-8d38-7ecb34a4058e-config-volume\") pod \"collect-profiles-29325840-gsdn5\" (UID: \"82ff352b-58da-4327-8d38-7ecb34a4058e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.992557 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6s7g\" (UniqueName: 
\"kubernetes.io/projected/82ff352b-58da-4327-8d38-7ecb34a4058e-kube-api-access-h6s7g\") pod \"collect-profiles-29325840-gsdn5\" (UID: \"82ff352b-58da-4327-8d38-7ecb34a4058e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" Oct 04 04:00:00 crc kubenswrapper[4964]: I1004 04:00:00.997341 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82ff352b-58da-4327-8d38-7ecb34a4058e-secret-volume\") pod \"collect-profiles-29325840-gsdn5\" (UID: \"82ff352b-58da-4327-8d38-7ecb34a4058e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" Oct 04 04:00:01 crc kubenswrapper[4964]: I1004 04:00:01.111476 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" Oct 04 04:00:01 crc kubenswrapper[4964]: I1004 04:00:01.630683 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5"] Oct 04 04:00:01 crc kubenswrapper[4964]: W1004 04:00:01.641341 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82ff352b_58da_4327_8d38_7ecb34a4058e.slice/crio-ed85d574f1894f82e06557686830fbcec6a3ae1d970bb64b3bb4eaa1de318733 WatchSource:0}: Error finding container ed85d574f1894f82e06557686830fbcec6a3ae1d970bb64b3bb4eaa1de318733: Status 404 returned error can't find the container with id ed85d574f1894f82e06557686830fbcec6a3ae1d970bb64b3bb4eaa1de318733 Oct 04 04:00:02 crc kubenswrapper[4964]: I1004 04:00:02.656984 4964 generic.go:334] "Generic (PLEG): container finished" podID="82ff352b-58da-4327-8d38-7ecb34a4058e" containerID="e7f2727bedfd74553c698acd2ac6a5a3f919aaa3cb0ba9dc7713068ac1feeaf2" exitCode=0 Oct 04 04:00:02 crc kubenswrapper[4964]: I1004 04:00:02.657537 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" event={"ID":"82ff352b-58da-4327-8d38-7ecb34a4058e","Type":"ContainerDied","Data":"e7f2727bedfd74553c698acd2ac6a5a3f919aaa3cb0ba9dc7713068ac1feeaf2"} Oct 04 04:00:02 crc kubenswrapper[4964]: I1004 04:00:02.657720 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" event={"ID":"82ff352b-58da-4327-8d38-7ecb34a4058e","Type":"ContainerStarted","Data":"ed85d574f1894f82e06557686830fbcec6a3ae1d970bb64b3bb4eaa1de318733"} Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.096324 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.166417 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82ff352b-58da-4327-8d38-7ecb34a4058e-secret-volume\") pod \"82ff352b-58da-4327-8d38-7ecb34a4058e\" (UID: \"82ff352b-58da-4327-8d38-7ecb34a4058e\") " Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.172874 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82ff352b-58da-4327-8d38-7ecb34a4058e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "82ff352b-58da-4327-8d38-7ecb34a4058e" (UID: "82ff352b-58da-4327-8d38-7ecb34a4058e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.268588 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82ff352b-58da-4327-8d38-7ecb34a4058e-config-volume\") pod \"82ff352b-58da-4327-8d38-7ecb34a4058e\" (UID: \"82ff352b-58da-4327-8d38-7ecb34a4058e\") " Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.268681 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6s7g\" (UniqueName: \"kubernetes.io/projected/82ff352b-58da-4327-8d38-7ecb34a4058e-kube-api-access-h6s7g\") pod \"82ff352b-58da-4327-8d38-7ecb34a4058e\" (UID: \"82ff352b-58da-4327-8d38-7ecb34a4058e\") " Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.269333 4964 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82ff352b-58da-4327-8d38-7ecb34a4058e-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.269319 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82ff352b-58da-4327-8d38-7ecb34a4058e-config-volume" (OuterVolumeSpecName: "config-volume") pod "82ff352b-58da-4327-8d38-7ecb34a4058e" (UID: "82ff352b-58da-4327-8d38-7ecb34a4058e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.273451 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82ff352b-58da-4327-8d38-7ecb34a4058e-kube-api-access-h6s7g" (OuterVolumeSpecName: "kube-api-access-h6s7g") pod "82ff352b-58da-4327-8d38-7ecb34a4058e" (UID: "82ff352b-58da-4327-8d38-7ecb34a4058e"). InnerVolumeSpecName "kube-api-access-h6s7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.371698 4964 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82ff352b-58da-4327-8d38-7ecb34a4058e-config-volume\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.371997 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6s7g\" (UniqueName: \"kubernetes.io/projected/82ff352b-58da-4327-8d38-7ecb34a4058e-kube-api-access-h6s7g\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.448699 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.448761 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.448805 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.449510 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7acd97a2d943148214dec2a1e45b876bb35371b2538d34908344419f0e83dc2c"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 
04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.449558 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://7acd97a2d943148214dec2a1e45b876bb35371b2538d34908344419f0e83dc2c" gracePeriod=600 Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.674434 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" event={"ID":"82ff352b-58da-4327-8d38-7ecb34a4058e","Type":"ContainerDied","Data":"ed85d574f1894f82e06557686830fbcec6a3ae1d970bb64b3bb4eaa1de318733"} Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.674474 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed85d574f1894f82e06557686830fbcec6a3ae1d970bb64b3bb4eaa1de318733" Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.674451 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325840-gsdn5" Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.679146 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="7acd97a2d943148214dec2a1e45b876bb35371b2538d34908344419f0e83dc2c" exitCode=0 Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.679184 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"7acd97a2d943148214dec2a1e45b876bb35371b2538d34908344419f0e83dc2c"} Oct 04 04:00:04 crc kubenswrapper[4964]: I1004 04:00:04.679214 4964 scope.go:117] "RemoveContainer" containerID="aaf148c2627f16856894057850413cb1958f161f08f37e0d8e5b4cd85b755d29" Oct 04 04:00:05 crc kubenswrapper[4964]: I1004 04:00:05.182118 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v"] Oct 04 04:00:05 crc kubenswrapper[4964]: I1004 04:00:05.188867 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325795-2p95v"] Oct 04 04:00:05 crc kubenswrapper[4964]: I1004 04:00:05.694295 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c"} Oct 04 04:00:06 crc kubenswrapper[4964]: I1004 04:00:06.865280 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23077962-587a-4a71-9672-222d7348d24f" path="/var/lib/kubelet/pods/23077962-587a-4a71-9672-222d7348d24f/volumes" Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.094804 4964 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-npgsz"] Oct 04 04:00:26 crc kubenswrapper[4964]: E1004 04:00:26.096054 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82ff352b-58da-4327-8d38-7ecb34a4058e" containerName="collect-profiles" Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.096074 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="82ff352b-58da-4327-8d38-7ecb34a4058e" containerName="collect-profiles" Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.096491 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="82ff352b-58da-4327-8d38-7ecb34a4058e" containerName="collect-profiles" Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.114591 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-npgsz"] Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.114757 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.279695 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nzmg\" (UniqueName: \"kubernetes.io/projected/6bcd2df1-0186-4eba-aace-d725fc763d74-kube-api-access-5nzmg\") pod \"redhat-operators-npgsz\" (UID: \"6bcd2df1-0186-4eba-aace-d725fc763d74\") " pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.280597 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bcd2df1-0186-4eba-aace-d725fc763d74-catalog-content\") pod \"redhat-operators-npgsz\" (UID: \"6bcd2df1-0186-4eba-aace-d725fc763d74\") " pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.281057 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bcd2df1-0186-4eba-aace-d725fc763d74-utilities\") pod \"redhat-operators-npgsz\" (UID: \"6bcd2df1-0186-4eba-aace-d725fc763d74\") " pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.382748 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bcd2df1-0186-4eba-aace-d725fc763d74-catalog-content\") pod \"redhat-operators-npgsz\" (UID: \"6bcd2df1-0186-4eba-aace-d725fc763d74\") " pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.382870 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bcd2df1-0186-4eba-aace-d725fc763d74-utilities\") pod \"redhat-operators-npgsz\" (UID: \"6bcd2df1-0186-4eba-aace-d725fc763d74\") " pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.382967 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nzmg\" (UniqueName: \"kubernetes.io/projected/6bcd2df1-0186-4eba-aace-d725fc763d74-kube-api-access-5nzmg\") pod \"redhat-operators-npgsz\" (UID: \"6bcd2df1-0186-4eba-aace-d725fc763d74\") " pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.384093 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bcd2df1-0186-4eba-aace-d725fc763d74-catalog-content\") pod \"redhat-operators-npgsz\" (UID: \"6bcd2df1-0186-4eba-aace-d725fc763d74\") " pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.384394 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6bcd2df1-0186-4eba-aace-d725fc763d74-utilities\") pod \"redhat-operators-npgsz\" (UID: \"6bcd2df1-0186-4eba-aace-d725fc763d74\") " pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.414947 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nzmg\" (UniqueName: \"kubernetes.io/projected/6bcd2df1-0186-4eba-aace-d725fc763d74-kube-api-access-5nzmg\") pod \"redhat-operators-npgsz\" (UID: \"6bcd2df1-0186-4eba-aace-d725fc763d74\") " pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.454800 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:26 crc kubenswrapper[4964]: I1004 04:00:26.943301 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-npgsz"] Oct 04 04:00:27 crc kubenswrapper[4964]: I1004 04:00:27.939585 4964 generic.go:334] "Generic (PLEG): container finished" podID="6bcd2df1-0186-4eba-aace-d725fc763d74" containerID="2e98e872e0700945d3f10c84b1a6a6398966998cfc2c54f437e8735bc29e0459" exitCode=0 Oct 04 04:00:27 crc kubenswrapper[4964]: I1004 04:00:27.939687 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npgsz" event={"ID":"6bcd2df1-0186-4eba-aace-d725fc763d74","Type":"ContainerDied","Data":"2e98e872e0700945d3f10c84b1a6a6398966998cfc2c54f437e8735bc29e0459"} Oct 04 04:00:27 crc kubenswrapper[4964]: I1004 04:00:27.940308 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npgsz" event={"ID":"6bcd2df1-0186-4eba-aace-d725fc763d74","Type":"ContainerStarted","Data":"3a7ab3e24b0ee7dc6ed8a97ea5be6c7300f1c1a140b526a99a32228cc6eaaf1e"} Oct 04 04:00:27 crc kubenswrapper[4964]: I1004 04:00:27.941916 4964 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 04 04:00:28 crc kubenswrapper[4964]: I1004 04:00:28.958447 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npgsz" event={"ID":"6bcd2df1-0186-4eba-aace-d725fc763d74","Type":"ContainerStarted","Data":"0e6bc7da4d28d25c4210cc12d891f9579c9c61ab7b461743c5e97b80a4303234"} Oct 04 04:00:29 crc kubenswrapper[4964]: I1004 04:00:29.973233 4964 generic.go:334] "Generic (PLEG): container finished" podID="a803f4c5-eef4-426a-acea-039c19405797" containerID="e782f5ae2bd13361426f77ba3c07203a64c14fe2285c79ede6317d265ecb4239" exitCode=1 Oct 04 04:00:29 crc kubenswrapper[4964]: I1004 04:00:29.973363 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a803f4c5-eef4-426a-acea-039c19405797","Type":"ContainerDied","Data":"e782f5ae2bd13361426f77ba3c07203a64c14fe2285c79ede6317d265ecb4239"} Oct 04 04:00:30 crc kubenswrapper[4964]: I1004 04:00:30.845753 4964 scope.go:117] "RemoveContainer" containerID="8af6b7d7aa13c6bfe0d1aaf4da6590fdb82dc9075245c051a7224278065c4a69" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.432424 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.462379 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jn4xh"] Oct 04 04:00:31 crc kubenswrapper[4964]: E1004 04:00:31.462973 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a803f4c5-eef4-426a-acea-039c19405797" containerName="tempest-tests-tempest-tests-runner" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.463003 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="a803f4c5-eef4-426a-acea-039c19405797" containerName="tempest-tests-tempest-tests-runner" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.463411 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="a803f4c5-eef4-426a-acea-039c19405797" containerName="tempest-tests-tempest-tests-runner" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.474441 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.495487 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jn4xh"] Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.596196 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-openstack-config-secret\") pod \"a803f4c5-eef4-426a-acea-039c19405797\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.596276 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a803f4c5-eef4-426a-acea-039c19405797-test-operator-ephemeral-temporary\") pod \"a803f4c5-eef4-426a-acea-039c19405797\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.596390 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a803f4c5-eef4-426a-acea-039c19405797-openstack-config\") pod \"a803f4c5-eef4-426a-acea-039c19405797\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.596476 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"a803f4c5-eef4-426a-acea-039c19405797\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.596530 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a803f4c5-eef4-426a-acea-039c19405797-test-operator-ephemeral-workdir\") pod 
\"a803f4c5-eef4-426a-acea-039c19405797\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.596581 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-ca-certs\") pod \"a803f4c5-eef4-426a-acea-039c19405797\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.596660 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-599ws\" (UniqueName: \"kubernetes.io/projected/a803f4c5-eef4-426a-acea-039c19405797-kube-api-access-599ws\") pod \"a803f4c5-eef4-426a-acea-039c19405797\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.596701 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-ssh-key\") pod \"a803f4c5-eef4-426a-acea-039c19405797\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.596803 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a803f4c5-eef4-426a-acea-039c19405797-config-data\") pod \"a803f4c5-eef4-426a-acea-039c19405797\" (UID: \"a803f4c5-eef4-426a-acea-039c19405797\") " Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.597137 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a803f4c5-eef4-426a-acea-039c19405797-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "a803f4c5-eef4-426a-acea-039c19405797" (UID: "a803f4c5-eef4-426a-acea-039c19405797"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.597149 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9818c7a-a4b7-414a-9dde-40b4951490f9-catalog-content\") pod \"redhat-marketplace-jn4xh\" (UID: \"a9818c7a-a4b7-414a-9dde-40b4951490f9\") " pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.597645 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlrvv\" (UniqueName: \"kubernetes.io/projected/a9818c7a-a4b7-414a-9dde-40b4951490f9-kube-api-access-xlrvv\") pod \"redhat-marketplace-jn4xh\" (UID: \"a9818c7a-a4b7-414a-9dde-40b4951490f9\") " pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.597929 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9818c7a-a4b7-414a-9dde-40b4951490f9-utilities\") pod \"redhat-marketplace-jn4xh\" (UID: \"a9818c7a-a4b7-414a-9dde-40b4951490f9\") " pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.598047 4964 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a803f4c5-eef4-426a-acea-039c19405797-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.598773 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a803f4c5-eef4-426a-acea-039c19405797-config-data" (OuterVolumeSpecName: "config-data") pod "a803f4c5-eef4-426a-acea-039c19405797" (UID: "a803f4c5-eef4-426a-acea-039c19405797"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.603480 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "a803f4c5-eef4-426a-acea-039c19405797" (UID: "a803f4c5-eef4-426a-acea-039c19405797"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.604183 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a803f4c5-eef4-426a-acea-039c19405797-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "a803f4c5-eef4-426a-acea-039c19405797" (UID: "a803f4c5-eef4-426a-acea-039c19405797"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.606686 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a803f4c5-eef4-426a-acea-039c19405797-kube-api-access-599ws" (OuterVolumeSpecName: "kube-api-access-599ws") pod "a803f4c5-eef4-426a-acea-039c19405797" (UID: "a803f4c5-eef4-426a-acea-039c19405797"). InnerVolumeSpecName "kube-api-access-599ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.643300 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "a803f4c5-eef4-426a-acea-039c19405797" (UID: "a803f4c5-eef4-426a-acea-039c19405797"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.647892 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a803f4c5-eef4-426a-acea-039c19405797" (UID: "a803f4c5-eef4-426a-acea-039c19405797"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.651734 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a803f4c5-eef4-426a-acea-039c19405797" (UID: "a803f4c5-eef4-426a-acea-039c19405797"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.656026 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a803f4c5-eef4-426a-acea-039c19405797-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a803f4c5-eef4-426a-acea-039c19405797" (UID: "a803f4c5-eef4-426a-acea-039c19405797"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.700070 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlrvv\" (UniqueName: \"kubernetes.io/projected/a9818c7a-a4b7-414a-9dde-40b4951490f9-kube-api-access-xlrvv\") pod \"redhat-marketplace-jn4xh\" (UID: \"a9818c7a-a4b7-414a-9dde-40b4951490f9\") " pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.700210 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9818c7a-a4b7-414a-9dde-40b4951490f9-utilities\") pod \"redhat-marketplace-jn4xh\" (UID: \"a9818c7a-a4b7-414a-9dde-40b4951490f9\") " pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.700263 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9818c7a-a4b7-414a-9dde-40b4951490f9-catalog-content\") pod \"redhat-marketplace-jn4xh\" (UID: \"a9818c7a-a4b7-414a-9dde-40b4951490f9\") " pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.700371 4964 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a803f4c5-eef4-426a-acea-039c19405797-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.700404 4964 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.700420 4964 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/a803f4c5-eef4-426a-acea-039c19405797-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.700435 4964 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.700449 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-599ws\" (UniqueName: \"kubernetes.io/projected/a803f4c5-eef4-426a-acea-039c19405797-kube-api-access-599ws\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.700462 4964 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.700473 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a803f4c5-eef4-426a-acea-039c19405797-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.700484 4964 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a803f4c5-eef4-426a-acea-039c19405797-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.700917 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9818c7a-a4b7-414a-9dde-40b4951490f9-utilities\") pod \"redhat-marketplace-jn4xh\" (UID: \"a9818c7a-a4b7-414a-9dde-40b4951490f9\") " pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.700944 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/a9818c7a-a4b7-414a-9dde-40b4951490f9-catalog-content\") pod \"redhat-marketplace-jn4xh\" (UID: \"a9818c7a-a4b7-414a-9dde-40b4951490f9\") " pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.722748 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlrvv\" (UniqueName: \"kubernetes.io/projected/a9818c7a-a4b7-414a-9dde-40b4951490f9-kube-api-access-xlrvv\") pod \"redhat-marketplace-jn4xh\" (UID: \"a9818c7a-a4b7-414a-9dde-40b4951490f9\") " pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.727498 4964 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.802347 4964 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.802822 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.995952 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a803f4c5-eef4-426a-acea-039c19405797","Type":"ContainerDied","Data":"62cc3f35109656f97b1fab54c1108e7ebd7b8e2720c2102e6c5d9f5c6acf8b1b"} Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.996017 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62cc3f35109656f97b1fab54c1108e7ebd7b8e2720c2102e6c5d9f5c6acf8b1b" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.996085 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.999457 4964 generic.go:334] "Generic (PLEG): container finished" podID="6bcd2df1-0186-4eba-aace-d725fc763d74" containerID="0e6bc7da4d28d25c4210cc12d891f9579c9c61ab7b461743c5e97b80a4303234" exitCode=0 Oct 04 04:00:31 crc kubenswrapper[4964]: I1004 04:00:31.999506 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npgsz" event={"ID":"6bcd2df1-0186-4eba-aace-d725fc763d74","Type":"ContainerDied","Data":"0e6bc7da4d28d25c4210cc12d891f9579c9c61ab7b461743c5e97b80a4303234"} Oct 04 04:00:32 crc kubenswrapper[4964]: I1004 04:00:32.319161 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jn4xh"] Oct 04 04:00:33 crc kubenswrapper[4964]: I1004 04:00:33.011756 4964 generic.go:334] "Generic (PLEG): container finished" podID="a9818c7a-a4b7-414a-9dde-40b4951490f9" containerID="fc58b064e51160f063cdc00cdafe5851faa27aa0147bf866e36679d6c658c1f7" exitCode=0 Oct 04 04:00:33 crc kubenswrapper[4964]: I1004 04:00:33.012688 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jn4xh" event={"ID":"a9818c7a-a4b7-414a-9dde-40b4951490f9","Type":"ContainerDied","Data":"fc58b064e51160f063cdc00cdafe5851faa27aa0147bf866e36679d6c658c1f7"} Oct 04 04:00:33 crc kubenswrapper[4964]: I1004 04:00:33.012741 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jn4xh" event={"ID":"a9818c7a-a4b7-414a-9dde-40b4951490f9","Type":"ContainerStarted","Data":"b230ab4c2d6c6876cfe19eabe13f952ab93fb658abfb20ee829fa605b3c6c7fe"} Oct 04 04:00:33 crc kubenswrapper[4964]: I1004 04:00:33.017660 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npgsz" 
event={"ID":"6bcd2df1-0186-4eba-aace-d725fc763d74","Type":"ContainerStarted","Data":"e6fe349dce95defb83412064094e0699147b6ea82aac8b66c7101ab161393579"} Oct 04 04:00:33 crc kubenswrapper[4964]: I1004 04:00:33.092022 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-npgsz" podStartSLOduration=2.385692019 podStartE2EDuration="7.091999552s" podCreationTimestamp="2025-10-04 04:00:26 +0000 UTC" firstStartedPulling="2025-10-04 04:00:27.941417835 +0000 UTC m=+4807.838376503" lastFinishedPulling="2025-10-04 04:00:32.647725388 +0000 UTC m=+4812.544684036" observedRunningTime="2025-10-04 04:00:33.060842274 +0000 UTC m=+4812.957800922" watchObservedRunningTime="2025-10-04 04:00:33.091999552 +0000 UTC m=+4812.988958180" Oct 04 04:00:34 crc kubenswrapper[4964]: I1004 04:00:34.035930 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jn4xh" event={"ID":"a9818c7a-a4b7-414a-9dde-40b4951490f9","Type":"ContainerStarted","Data":"887ddd8f0a5aef4d637f4927dae5d7009a54eed1e222c4fab9e519f9601533df"} Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.048483 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.050512 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.052768 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-tbt6v" Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.062858 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.078945 4964 generic.go:334] "Generic (PLEG): container finished" podID="a9818c7a-a4b7-414a-9dde-40b4951490f9" containerID="887ddd8f0a5aef4d637f4927dae5d7009a54eed1e222c4fab9e519f9601533df" exitCode=0 Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.078990 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jn4xh" event={"ID":"a9818c7a-a4b7-414a-9dde-40b4951490f9","Type":"ContainerDied","Data":"887ddd8f0a5aef4d637f4927dae5d7009a54eed1e222c4fab9e519f9601533df"} Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.210829 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7l79\" (UniqueName: \"kubernetes.io/projected/84508e92-e7bb-4e60-b695-cfdb38605416-kube-api-access-n7l79\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"84508e92-e7bb-4e60-b695-cfdb38605416\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.210925 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"84508e92-e7bb-4e60-b695-cfdb38605416\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.312453 4964 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7l79\" (UniqueName: \"kubernetes.io/projected/84508e92-e7bb-4e60-b695-cfdb38605416-kube-api-access-n7l79\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"84508e92-e7bb-4e60-b695-cfdb38605416\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.312548 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"84508e92-e7bb-4e60-b695-cfdb38605416\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.313333 4964 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"84508e92-e7bb-4e60-b695-cfdb38605416\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.343272 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7l79\" (UniqueName: \"kubernetes.io/projected/84508e92-e7bb-4e60-b695-cfdb38605416-kube-api-access-n7l79\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"84508e92-e7bb-4e60-b695-cfdb38605416\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.373713 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: 
\"84508e92-e7bb-4e60-b695-cfdb38605416\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.403044 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.455718 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.457686 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:36 crc kubenswrapper[4964]: I1004 04:00:36.925197 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 04 04:00:36 crc kubenswrapper[4964]: W1004 04:00:36.927683 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84508e92_e7bb_4e60_b695_cfdb38605416.slice/crio-3eab417f761486232db90b24a32cb514e709d2e9d62e3aa87904263d4e24877a WatchSource:0}: Error finding container 3eab417f761486232db90b24a32cb514e709d2e9d62e3aa87904263d4e24877a: Status 404 returned error can't find the container with id 3eab417f761486232db90b24a32cb514e709d2e9d62e3aa87904263d4e24877a Oct 04 04:00:37 crc kubenswrapper[4964]: I1004 04:00:37.094364 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jn4xh" event={"ID":"a9818c7a-a4b7-414a-9dde-40b4951490f9","Type":"ContainerStarted","Data":"916bc0d8cb33c6fa4e7a3ad8eb11635ab9c6bf7482e3fd710aa1caaca84da78b"} Oct 04 04:00:37 crc kubenswrapper[4964]: I1004 04:00:37.097639 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" 
event={"ID":"84508e92-e7bb-4e60-b695-cfdb38605416","Type":"ContainerStarted","Data":"3eab417f761486232db90b24a32cb514e709d2e9d62e3aa87904263d4e24877a"} Oct 04 04:00:37 crc kubenswrapper[4964]: I1004 04:00:37.123198 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jn4xh" podStartSLOduration=2.441492305 podStartE2EDuration="6.123182544s" podCreationTimestamp="2025-10-04 04:00:31 +0000 UTC" firstStartedPulling="2025-10-04 04:00:33.015165619 +0000 UTC m=+4812.912124267" lastFinishedPulling="2025-10-04 04:00:36.696855868 +0000 UTC m=+4816.593814506" observedRunningTime="2025-10-04 04:00:37.117839992 +0000 UTC m=+4817.014798630" watchObservedRunningTime="2025-10-04 04:00:37.123182544 +0000 UTC m=+4817.020141182" Oct 04 04:00:37 crc kubenswrapper[4964]: I1004 04:00:37.533292 4964 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-npgsz" podUID="6bcd2df1-0186-4eba-aace-d725fc763d74" containerName="registry-server" probeResult="failure" output=< Oct 04 04:00:37 crc kubenswrapper[4964]: timeout: failed to connect service ":50051" within 1s Oct 04 04:00:37 crc kubenswrapper[4964]: > Oct 04 04:00:38 crc kubenswrapper[4964]: I1004 04:00:38.110365 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"84508e92-e7bb-4e60-b695-cfdb38605416","Type":"ContainerStarted","Data":"d64be63b54219a688c14c484377a910b06a90b76616329b7d9b4718932f70da4"} Oct 04 04:00:41 crc kubenswrapper[4964]: I1004 04:00:41.803765 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:41 crc kubenswrapper[4964]: I1004 04:00:41.804523 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:41 crc kubenswrapper[4964]: I1004 04:00:41.899285 4964 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:41 crc kubenswrapper[4964]: I1004 04:00:41.953003 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=5.047804702 podStartE2EDuration="5.952961471s" podCreationTimestamp="2025-10-04 04:00:36 +0000 UTC" firstStartedPulling="2025-10-04 04:00:36.930013207 +0000 UTC m=+4816.826971885" lastFinishedPulling="2025-10-04 04:00:37.835170016 +0000 UTC m=+4817.732128654" observedRunningTime="2025-10-04 04:00:38.131777673 +0000 UTC m=+4818.028736321" watchObservedRunningTime="2025-10-04 04:00:41.952961471 +0000 UTC m=+4821.849920159" Oct 04 04:00:42 crc kubenswrapper[4964]: I1004 04:00:42.233789 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:42 crc kubenswrapper[4964]: I1004 04:00:42.296372 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jn4xh"] Oct 04 04:00:44 crc kubenswrapper[4964]: I1004 04:00:44.194336 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jn4xh" podUID="a9818c7a-a4b7-414a-9dde-40b4951490f9" containerName="registry-server" containerID="cri-o://916bc0d8cb33c6fa4e7a3ad8eb11635ab9c6bf7482e3fd710aa1caaca84da78b" gracePeriod=2 Oct 04 04:00:44 crc kubenswrapper[4964]: I1004 04:00:44.690705 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:44 crc kubenswrapper[4964]: I1004 04:00:44.866305 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9818c7a-a4b7-414a-9dde-40b4951490f9-utilities\") pod \"a9818c7a-a4b7-414a-9dde-40b4951490f9\" (UID: \"a9818c7a-a4b7-414a-9dde-40b4951490f9\") " Oct 04 04:00:44 crc kubenswrapper[4964]: I1004 04:00:44.866953 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlrvv\" (UniqueName: \"kubernetes.io/projected/a9818c7a-a4b7-414a-9dde-40b4951490f9-kube-api-access-xlrvv\") pod \"a9818c7a-a4b7-414a-9dde-40b4951490f9\" (UID: \"a9818c7a-a4b7-414a-9dde-40b4951490f9\") " Oct 04 04:00:44 crc kubenswrapper[4964]: I1004 04:00:44.868034 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9818c7a-a4b7-414a-9dde-40b4951490f9-catalog-content\") pod \"a9818c7a-a4b7-414a-9dde-40b4951490f9\" (UID: \"a9818c7a-a4b7-414a-9dde-40b4951490f9\") " Oct 04 04:00:44 crc kubenswrapper[4964]: I1004 04:00:44.868327 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9818c7a-a4b7-414a-9dde-40b4951490f9-utilities" (OuterVolumeSpecName: "utilities") pod "a9818c7a-a4b7-414a-9dde-40b4951490f9" (UID: "a9818c7a-a4b7-414a-9dde-40b4951490f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:00:44 crc kubenswrapper[4964]: I1004 04:00:44.876285 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9818c7a-a4b7-414a-9dde-40b4951490f9-kube-api-access-xlrvv" (OuterVolumeSpecName: "kube-api-access-xlrvv") pod "a9818c7a-a4b7-414a-9dde-40b4951490f9" (UID: "a9818c7a-a4b7-414a-9dde-40b4951490f9"). InnerVolumeSpecName "kube-api-access-xlrvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:00:44 crc kubenswrapper[4964]: I1004 04:00:44.898162 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9818c7a-a4b7-414a-9dde-40b4951490f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9818c7a-a4b7-414a-9dde-40b4951490f9" (UID: "a9818c7a-a4b7-414a-9dde-40b4951490f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:00:44 crc kubenswrapper[4964]: I1004 04:00:44.972026 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9818c7a-a4b7-414a-9dde-40b4951490f9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:44 crc kubenswrapper[4964]: I1004 04:00:44.972090 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9818c7a-a4b7-414a-9dde-40b4951490f9-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:44 crc kubenswrapper[4964]: I1004 04:00:44.972117 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlrvv\" (UniqueName: \"kubernetes.io/projected/a9818c7a-a4b7-414a-9dde-40b4951490f9-kube-api-access-xlrvv\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:45 crc kubenswrapper[4964]: I1004 04:00:45.211286 4964 generic.go:334] "Generic (PLEG): container finished" podID="a9818c7a-a4b7-414a-9dde-40b4951490f9" containerID="916bc0d8cb33c6fa4e7a3ad8eb11635ab9c6bf7482e3fd710aa1caaca84da78b" exitCode=0 Oct 04 04:00:45 crc kubenswrapper[4964]: I1004 04:00:45.211344 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jn4xh" event={"ID":"a9818c7a-a4b7-414a-9dde-40b4951490f9","Type":"ContainerDied","Data":"916bc0d8cb33c6fa4e7a3ad8eb11635ab9c6bf7482e3fd710aa1caaca84da78b"} Oct 04 04:00:45 crc kubenswrapper[4964]: I1004 04:00:45.211378 4964 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jn4xh" Oct 04 04:00:45 crc kubenswrapper[4964]: I1004 04:00:45.211406 4964 scope.go:117] "RemoveContainer" containerID="916bc0d8cb33c6fa4e7a3ad8eb11635ab9c6bf7482e3fd710aa1caaca84da78b" Oct 04 04:00:45 crc kubenswrapper[4964]: I1004 04:00:45.211388 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jn4xh" event={"ID":"a9818c7a-a4b7-414a-9dde-40b4951490f9","Type":"ContainerDied","Data":"b230ab4c2d6c6876cfe19eabe13f952ab93fb658abfb20ee829fa605b3c6c7fe"} Oct 04 04:00:45 crc kubenswrapper[4964]: I1004 04:00:45.257477 4964 scope.go:117] "RemoveContainer" containerID="887ddd8f0a5aef4d637f4927dae5d7009a54eed1e222c4fab9e519f9601533df" Oct 04 04:00:45 crc kubenswrapper[4964]: I1004 04:00:45.293387 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jn4xh"] Oct 04 04:00:45 crc kubenswrapper[4964]: I1004 04:00:45.313131 4964 scope.go:117] "RemoveContainer" containerID="fc58b064e51160f063cdc00cdafe5851faa27aa0147bf866e36679d6c658c1f7" Oct 04 04:00:45 crc kubenswrapper[4964]: I1004 04:00:45.313840 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jn4xh"] Oct 04 04:00:45 crc kubenswrapper[4964]: I1004 04:00:45.359827 4964 scope.go:117] "RemoveContainer" containerID="916bc0d8cb33c6fa4e7a3ad8eb11635ab9c6bf7482e3fd710aa1caaca84da78b" Oct 04 04:00:45 crc kubenswrapper[4964]: E1004 04:00:45.360389 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"916bc0d8cb33c6fa4e7a3ad8eb11635ab9c6bf7482e3fd710aa1caaca84da78b\": container with ID starting with 916bc0d8cb33c6fa4e7a3ad8eb11635ab9c6bf7482e3fd710aa1caaca84da78b not found: ID does not exist" containerID="916bc0d8cb33c6fa4e7a3ad8eb11635ab9c6bf7482e3fd710aa1caaca84da78b" Oct 04 04:00:45 crc kubenswrapper[4964]: I1004 04:00:45.360460 4964 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"916bc0d8cb33c6fa4e7a3ad8eb11635ab9c6bf7482e3fd710aa1caaca84da78b"} err="failed to get container status \"916bc0d8cb33c6fa4e7a3ad8eb11635ab9c6bf7482e3fd710aa1caaca84da78b\": rpc error: code = NotFound desc = could not find container \"916bc0d8cb33c6fa4e7a3ad8eb11635ab9c6bf7482e3fd710aa1caaca84da78b\": container with ID starting with 916bc0d8cb33c6fa4e7a3ad8eb11635ab9c6bf7482e3fd710aa1caaca84da78b not found: ID does not exist" Oct 04 04:00:45 crc kubenswrapper[4964]: I1004 04:00:45.360507 4964 scope.go:117] "RemoveContainer" containerID="887ddd8f0a5aef4d637f4927dae5d7009a54eed1e222c4fab9e519f9601533df" Oct 04 04:00:45 crc kubenswrapper[4964]: E1004 04:00:45.360915 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"887ddd8f0a5aef4d637f4927dae5d7009a54eed1e222c4fab9e519f9601533df\": container with ID starting with 887ddd8f0a5aef4d637f4927dae5d7009a54eed1e222c4fab9e519f9601533df not found: ID does not exist" containerID="887ddd8f0a5aef4d637f4927dae5d7009a54eed1e222c4fab9e519f9601533df" Oct 04 04:00:45 crc kubenswrapper[4964]: I1004 04:00:45.360966 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"887ddd8f0a5aef4d637f4927dae5d7009a54eed1e222c4fab9e519f9601533df"} err="failed to get container status \"887ddd8f0a5aef4d637f4927dae5d7009a54eed1e222c4fab9e519f9601533df\": rpc error: code = NotFound desc = could not find container \"887ddd8f0a5aef4d637f4927dae5d7009a54eed1e222c4fab9e519f9601533df\": container with ID starting with 887ddd8f0a5aef4d637f4927dae5d7009a54eed1e222c4fab9e519f9601533df not found: ID does not exist" Oct 04 04:00:45 crc kubenswrapper[4964]: I1004 04:00:45.361005 4964 scope.go:117] "RemoveContainer" containerID="fc58b064e51160f063cdc00cdafe5851faa27aa0147bf866e36679d6c658c1f7" Oct 04 04:00:45 crc kubenswrapper[4964]: E1004 
04:00:45.361301 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc58b064e51160f063cdc00cdafe5851faa27aa0147bf866e36679d6c658c1f7\": container with ID starting with fc58b064e51160f063cdc00cdafe5851faa27aa0147bf866e36679d6c658c1f7 not found: ID does not exist" containerID="fc58b064e51160f063cdc00cdafe5851faa27aa0147bf866e36679d6c658c1f7" Oct 04 04:00:45 crc kubenswrapper[4964]: I1004 04:00:45.361343 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc58b064e51160f063cdc00cdafe5851faa27aa0147bf866e36679d6c658c1f7"} err="failed to get container status \"fc58b064e51160f063cdc00cdafe5851faa27aa0147bf866e36679d6c658c1f7\": rpc error: code = NotFound desc = could not find container \"fc58b064e51160f063cdc00cdafe5851faa27aa0147bf866e36679d6c658c1f7\": container with ID starting with fc58b064e51160f063cdc00cdafe5851faa27aa0147bf866e36679d6c658c1f7 not found: ID does not exist" Oct 04 04:00:46 crc kubenswrapper[4964]: I1004 04:00:46.539966 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:46 crc kubenswrapper[4964]: I1004 04:00:46.626756 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:46 crc kubenswrapper[4964]: I1004 04:00:46.865611 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9818c7a-a4b7-414a-9dde-40b4951490f9" path="/var/lib/kubelet/pods/a9818c7a-a4b7-414a-9dde-40b4951490f9/volumes" Oct 04 04:00:47 crc kubenswrapper[4964]: I1004 04:00:47.558462 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-npgsz"] Oct 04 04:00:48 crc kubenswrapper[4964]: I1004 04:00:48.242805 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-npgsz" 
podUID="6bcd2df1-0186-4eba-aace-d725fc763d74" containerName="registry-server" containerID="cri-o://e6fe349dce95defb83412064094e0699147b6ea82aac8b66c7101ab161393579" gracePeriod=2 Oct 04 04:00:48 crc kubenswrapper[4964]: I1004 04:00:48.738952 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:48 crc kubenswrapper[4964]: I1004 04:00:48.861179 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nzmg\" (UniqueName: \"kubernetes.io/projected/6bcd2df1-0186-4eba-aace-d725fc763d74-kube-api-access-5nzmg\") pod \"6bcd2df1-0186-4eba-aace-d725fc763d74\" (UID: \"6bcd2df1-0186-4eba-aace-d725fc763d74\") " Oct 04 04:00:48 crc kubenswrapper[4964]: I1004 04:00:48.861340 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bcd2df1-0186-4eba-aace-d725fc763d74-catalog-content\") pod \"6bcd2df1-0186-4eba-aace-d725fc763d74\" (UID: \"6bcd2df1-0186-4eba-aace-d725fc763d74\") " Oct 04 04:00:48 crc kubenswrapper[4964]: I1004 04:00:48.862873 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bcd2df1-0186-4eba-aace-d725fc763d74-utilities\") pod \"6bcd2df1-0186-4eba-aace-d725fc763d74\" (UID: \"6bcd2df1-0186-4eba-aace-d725fc763d74\") " Oct 04 04:00:48 crc kubenswrapper[4964]: I1004 04:00:48.863871 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bcd2df1-0186-4eba-aace-d725fc763d74-utilities" (OuterVolumeSpecName: "utilities") pod "6bcd2df1-0186-4eba-aace-d725fc763d74" (UID: "6bcd2df1-0186-4eba-aace-d725fc763d74"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:00:48 crc kubenswrapper[4964]: I1004 04:00:48.864204 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bcd2df1-0186-4eba-aace-d725fc763d74-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:48 crc kubenswrapper[4964]: I1004 04:00:48.869390 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bcd2df1-0186-4eba-aace-d725fc763d74-kube-api-access-5nzmg" (OuterVolumeSpecName: "kube-api-access-5nzmg") pod "6bcd2df1-0186-4eba-aace-d725fc763d74" (UID: "6bcd2df1-0186-4eba-aace-d725fc763d74"). InnerVolumeSpecName "kube-api-access-5nzmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:00:48 crc kubenswrapper[4964]: I1004 04:00:48.957939 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bcd2df1-0186-4eba-aace-d725fc763d74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bcd2df1-0186-4eba-aace-d725fc763d74" (UID: "6bcd2df1-0186-4eba-aace-d725fc763d74"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:00:48 crc kubenswrapper[4964]: I1004 04:00:48.967134 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nzmg\" (UniqueName: \"kubernetes.io/projected/6bcd2df1-0186-4eba-aace-d725fc763d74-kube-api-access-5nzmg\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:48 crc kubenswrapper[4964]: I1004 04:00:48.967164 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bcd2df1-0186-4eba-aace-d725fc763d74-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:00:49 crc kubenswrapper[4964]: I1004 04:00:49.255207 4964 generic.go:334] "Generic (PLEG): container finished" podID="6bcd2df1-0186-4eba-aace-d725fc763d74" containerID="e6fe349dce95defb83412064094e0699147b6ea82aac8b66c7101ab161393579" exitCode=0 Oct 04 04:00:49 crc kubenswrapper[4964]: I1004 04:00:49.255314 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npgsz" event={"ID":"6bcd2df1-0186-4eba-aace-d725fc763d74","Type":"ContainerDied","Data":"e6fe349dce95defb83412064094e0699147b6ea82aac8b66c7101ab161393579"} Oct 04 04:00:49 crc kubenswrapper[4964]: I1004 04:00:49.255326 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-npgsz" Oct 04 04:00:49 crc kubenswrapper[4964]: I1004 04:00:49.256246 4964 scope.go:117] "RemoveContainer" containerID="e6fe349dce95defb83412064094e0699147b6ea82aac8b66c7101ab161393579" Oct 04 04:00:49 crc kubenswrapper[4964]: I1004 04:00:49.256026 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npgsz" event={"ID":"6bcd2df1-0186-4eba-aace-d725fc763d74","Type":"ContainerDied","Data":"3a7ab3e24b0ee7dc6ed8a97ea5be6c7300f1c1a140b526a99a32228cc6eaaf1e"} Oct 04 04:00:49 crc kubenswrapper[4964]: I1004 04:00:49.309342 4964 scope.go:117] "RemoveContainer" containerID="0e6bc7da4d28d25c4210cc12d891f9579c9c61ab7b461743c5e97b80a4303234" Oct 04 04:00:49 crc kubenswrapper[4964]: I1004 04:00:49.313603 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-npgsz"] Oct 04 04:00:49 crc kubenswrapper[4964]: I1004 04:00:49.322827 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-npgsz"] Oct 04 04:00:49 crc kubenswrapper[4964]: I1004 04:00:49.370788 4964 scope.go:117] "RemoveContainer" containerID="2e98e872e0700945d3f10c84b1a6a6398966998cfc2c54f437e8735bc29e0459" Oct 04 04:00:49 crc kubenswrapper[4964]: I1004 04:00:49.415696 4964 scope.go:117] "RemoveContainer" containerID="e6fe349dce95defb83412064094e0699147b6ea82aac8b66c7101ab161393579" Oct 04 04:00:49 crc kubenswrapper[4964]: E1004 04:00:49.416438 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6fe349dce95defb83412064094e0699147b6ea82aac8b66c7101ab161393579\": container with ID starting with e6fe349dce95defb83412064094e0699147b6ea82aac8b66c7101ab161393579 not found: ID does not exist" containerID="e6fe349dce95defb83412064094e0699147b6ea82aac8b66c7101ab161393579" Oct 04 04:00:49 crc kubenswrapper[4964]: I1004 04:00:49.416488 4964 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6fe349dce95defb83412064094e0699147b6ea82aac8b66c7101ab161393579"} err="failed to get container status \"e6fe349dce95defb83412064094e0699147b6ea82aac8b66c7101ab161393579\": rpc error: code = NotFound desc = could not find container \"e6fe349dce95defb83412064094e0699147b6ea82aac8b66c7101ab161393579\": container with ID starting with e6fe349dce95defb83412064094e0699147b6ea82aac8b66c7101ab161393579 not found: ID does not exist" Oct 04 04:00:49 crc kubenswrapper[4964]: I1004 04:00:49.416520 4964 scope.go:117] "RemoveContainer" containerID="0e6bc7da4d28d25c4210cc12d891f9579c9c61ab7b461743c5e97b80a4303234" Oct 04 04:00:49 crc kubenswrapper[4964]: E1004 04:00:49.417020 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e6bc7da4d28d25c4210cc12d891f9579c9c61ab7b461743c5e97b80a4303234\": container with ID starting with 0e6bc7da4d28d25c4210cc12d891f9579c9c61ab7b461743c5e97b80a4303234 not found: ID does not exist" containerID="0e6bc7da4d28d25c4210cc12d891f9579c9c61ab7b461743c5e97b80a4303234" Oct 04 04:00:49 crc kubenswrapper[4964]: I1004 04:00:49.417060 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e6bc7da4d28d25c4210cc12d891f9579c9c61ab7b461743c5e97b80a4303234"} err="failed to get container status \"0e6bc7da4d28d25c4210cc12d891f9579c9c61ab7b461743c5e97b80a4303234\": rpc error: code = NotFound desc = could not find container \"0e6bc7da4d28d25c4210cc12d891f9579c9c61ab7b461743c5e97b80a4303234\": container with ID starting with 0e6bc7da4d28d25c4210cc12d891f9579c9c61ab7b461743c5e97b80a4303234 not found: ID does not exist" Oct 04 04:00:49 crc kubenswrapper[4964]: I1004 04:00:49.417090 4964 scope.go:117] "RemoveContainer" containerID="2e98e872e0700945d3f10c84b1a6a6398966998cfc2c54f437e8735bc29e0459" Oct 04 04:00:49 crc kubenswrapper[4964]: E1004 
04:00:49.417501 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e98e872e0700945d3f10c84b1a6a6398966998cfc2c54f437e8735bc29e0459\": container with ID starting with 2e98e872e0700945d3f10c84b1a6a6398966998cfc2c54f437e8735bc29e0459 not found: ID does not exist" containerID="2e98e872e0700945d3f10c84b1a6a6398966998cfc2c54f437e8735bc29e0459" Oct 04 04:00:49 crc kubenswrapper[4964]: I1004 04:00:49.417565 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e98e872e0700945d3f10c84b1a6a6398966998cfc2c54f437e8735bc29e0459"} err="failed to get container status \"2e98e872e0700945d3f10c84b1a6a6398966998cfc2c54f437e8735bc29e0459\": rpc error: code = NotFound desc = could not find container \"2e98e872e0700945d3f10c84b1a6a6398966998cfc2c54f437e8735bc29e0459\": container with ID starting with 2e98e872e0700945d3f10c84b1a6a6398966998cfc2c54f437e8735bc29e0459 not found: ID does not exist" Oct 04 04:00:50 crc kubenswrapper[4964]: I1004 04:00:50.891008 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bcd2df1-0186-4eba-aace-d725fc763d74" path="/var/lib/kubelet/pods/6bcd2df1-0186-4eba-aace-d725fc763d74/volumes" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.161154 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29325841-5kllh"] Oct 04 04:01:00 crc kubenswrapper[4964]: E1004 04:01:00.182170 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9818c7a-a4b7-414a-9dde-40b4951490f9" containerName="registry-server" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.182221 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9818c7a-a4b7-414a-9dde-40b4951490f9" containerName="registry-server" Oct 04 04:01:00 crc kubenswrapper[4964]: E1004 04:01:00.182253 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bcd2df1-0186-4eba-aace-d725fc763d74" 
containerName="extract-utilities" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.182266 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bcd2df1-0186-4eba-aace-d725fc763d74" containerName="extract-utilities" Oct 04 04:01:00 crc kubenswrapper[4964]: E1004 04:01:00.182325 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9818c7a-a4b7-414a-9dde-40b4951490f9" containerName="extract-utilities" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.182336 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9818c7a-a4b7-414a-9dde-40b4951490f9" containerName="extract-utilities" Oct 04 04:01:00 crc kubenswrapper[4964]: E1004 04:01:00.182362 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9818c7a-a4b7-414a-9dde-40b4951490f9" containerName="extract-content" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.182373 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9818c7a-a4b7-414a-9dde-40b4951490f9" containerName="extract-content" Oct 04 04:01:00 crc kubenswrapper[4964]: E1004 04:01:00.182423 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bcd2df1-0186-4eba-aace-d725fc763d74" containerName="extract-content" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.182433 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bcd2df1-0186-4eba-aace-d725fc763d74" containerName="extract-content" Oct 04 04:01:00 crc kubenswrapper[4964]: E1004 04:01:00.182456 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bcd2df1-0186-4eba-aace-d725fc763d74" containerName="registry-server" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.182469 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bcd2df1-0186-4eba-aace-d725fc763d74" containerName="registry-server" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.185248 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bcd2df1-0186-4eba-aace-d725fc763d74" 
containerName="registry-server" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.185320 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9818c7a-a4b7-414a-9dde-40b4951490f9" containerName="registry-server" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.190979 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.190478 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29325841-5kllh"] Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.247806 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-config-data\") pod \"keystone-cron-29325841-5kllh\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.247885 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvldl\" (UniqueName: \"kubernetes.io/projected/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-kube-api-access-vvldl\") pod \"keystone-cron-29325841-5kllh\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.247979 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-combined-ca-bundle\") pod \"keystone-cron-29325841-5kllh\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.248037 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-fernet-keys\") pod \"keystone-cron-29325841-5kllh\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.350485 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-combined-ca-bundle\") pod \"keystone-cron-29325841-5kllh\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.350892 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-fernet-keys\") pod \"keystone-cron-29325841-5kllh\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.351127 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-config-data\") pod \"keystone-cron-29325841-5kllh\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.351282 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvldl\" (UniqueName: \"kubernetes.io/projected/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-kube-api-access-vvldl\") pod \"keystone-cron-29325841-5kllh\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.361226 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-combined-ca-bundle\") pod \"keystone-cron-29325841-5kllh\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.361356 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-fernet-keys\") pod \"keystone-cron-29325841-5kllh\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.362737 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-config-data\") pod \"keystone-cron-29325841-5kllh\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.376897 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvldl\" (UniqueName: \"kubernetes.io/projected/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-kube-api-access-vvldl\") pod \"keystone-cron-29325841-5kllh\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:00 crc kubenswrapper[4964]: I1004 04:01:00.536981 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:01 crc kubenswrapper[4964]: I1004 04:01:01.064103 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29325841-5kllh"] Oct 04 04:01:01 crc kubenswrapper[4964]: I1004 04:01:01.403576 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325841-5kllh" event={"ID":"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3","Type":"ContainerStarted","Data":"14c2f15c02431d221606b5d2d6bbb10b33e5b0452e606da64e599144d4a8fe79"} Oct 04 04:01:01 crc kubenswrapper[4964]: I1004 04:01:01.403665 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325841-5kllh" event={"ID":"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3","Type":"ContainerStarted","Data":"fd0bd1ca8122242d4c69e33392722dd76c449d5c434303fb408f6fc294158311"} Oct 04 04:01:02 crc kubenswrapper[4964]: I1004 04:01:02.448259 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29325841-5kllh" podStartSLOduration=2.448240612 podStartE2EDuration="2.448240612s" podCreationTimestamp="2025-10-04 04:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:01:02.445833728 +0000 UTC m=+4842.342792396" watchObservedRunningTime="2025-10-04 04:01:02.448240612 +0000 UTC m=+4842.345199270" Oct 04 04:01:05 crc kubenswrapper[4964]: I1004 04:01:05.455728 4964 generic.go:334] "Generic (PLEG): container finished" podID="70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3" containerID="14c2f15c02431d221606b5d2d6bbb10b33e5b0452e606da64e599144d4a8fe79" exitCode=0 Oct 04 04:01:05 crc kubenswrapper[4964]: I1004 04:01:05.456080 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325841-5kllh" 
event={"ID":"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3","Type":"ContainerDied","Data":"14c2f15c02431d221606b5d2d6bbb10b33e5b0452e606da64e599144d4a8fe79"} Oct 04 04:01:06 crc kubenswrapper[4964]: I1004 04:01:06.818210 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:06 crc kubenswrapper[4964]: I1004 04:01:06.892399 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-fernet-keys\") pod \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " Oct 04 04:01:06 crc kubenswrapper[4964]: I1004 04:01:06.892732 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-config-data\") pod \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " Oct 04 04:01:06 crc kubenswrapper[4964]: I1004 04:01:06.892785 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvldl\" (UniqueName: \"kubernetes.io/projected/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-kube-api-access-vvldl\") pod \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " Oct 04 04:01:06 crc kubenswrapper[4964]: I1004 04:01:06.892817 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-combined-ca-bundle\") pod \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\" (UID: \"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3\") " Oct 04 04:01:06 crc kubenswrapper[4964]: I1004 04:01:06.900557 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3" (UID: "70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:01:06 crc kubenswrapper[4964]: I1004 04:01:06.901154 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-kube-api-access-vvldl" (OuterVolumeSpecName: "kube-api-access-vvldl") pod "70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3" (UID: "70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3"). InnerVolumeSpecName "kube-api-access-vvldl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:01:06 crc kubenswrapper[4964]: I1004 04:01:06.929163 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3" (UID: "70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:01:06 crc kubenswrapper[4964]: I1004 04:01:06.956567 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-config-data" (OuterVolumeSpecName: "config-data") pod "70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3" (UID: "70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:01:06 crc kubenswrapper[4964]: I1004 04:01:06.995894 4964 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 04 04:01:06 crc kubenswrapper[4964]: I1004 04:01:06.995926 4964 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 04:01:06 crc kubenswrapper[4964]: I1004 04:01:06.995939 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvldl\" (UniqueName: \"kubernetes.io/projected/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-kube-api-access-vvldl\") on node \"crc\" DevicePath \"\"" Oct 04 04:01:06 crc kubenswrapper[4964]: I1004 04:01:06.995952 4964 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 04:01:07 crc kubenswrapper[4964]: I1004 04:01:07.475054 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325841-5kllh" event={"ID":"70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3","Type":"ContainerDied","Data":"fd0bd1ca8122242d4c69e33392722dd76c449d5c434303fb408f6fc294158311"} Oct 04 04:01:07 crc kubenswrapper[4964]: I1004 04:01:07.475091 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd0bd1ca8122242d4c69e33392722dd76c449d5c434303fb408f6fc294158311" Oct 04 04:01:07 crc kubenswrapper[4964]: I1004 04:01:07.475145 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29325841-5kllh" Oct 04 04:01:15 crc kubenswrapper[4964]: I1004 04:01:15.735095 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vzs6p/must-gather-kzb6l"] Oct 04 04:01:15 crc kubenswrapper[4964]: E1004 04:01:15.735949 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3" containerName="keystone-cron" Oct 04 04:01:15 crc kubenswrapper[4964]: I1004 04:01:15.735963 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3" containerName="keystone-cron" Oct 04 04:01:15 crc kubenswrapper[4964]: I1004 04:01:15.736143 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3" containerName="keystone-cron" Oct 04 04:01:15 crc kubenswrapper[4964]: I1004 04:01:15.737081 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vzs6p/must-gather-kzb6l" Oct 04 04:01:15 crc kubenswrapper[4964]: I1004 04:01:15.739547 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vzs6p"/"openshift-service-ca.crt" Oct 04 04:01:15 crc kubenswrapper[4964]: I1004 04:01:15.742886 4964 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vzs6p"/"kube-root-ca.crt" Oct 04 04:01:15 crc kubenswrapper[4964]: I1004 04:01:15.742956 4964 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vzs6p"/"default-dockercfg-hx5gw" Oct 04 04:01:15 crc kubenswrapper[4964]: I1004 04:01:15.744063 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vzs6p/must-gather-kzb6l"] Oct 04 04:01:15 crc kubenswrapper[4964]: I1004 04:01:15.796028 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/a215b0cf-24f2-4e4e-a1df-f6d947c27301-must-gather-output\") pod \"must-gather-kzb6l\" (UID: \"a215b0cf-24f2-4e4e-a1df-f6d947c27301\") " pod="openshift-must-gather-vzs6p/must-gather-kzb6l" Oct 04 04:01:15 crc kubenswrapper[4964]: I1004 04:01:15.796192 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhm55\" (UniqueName: \"kubernetes.io/projected/a215b0cf-24f2-4e4e-a1df-f6d947c27301-kube-api-access-jhm55\") pod \"must-gather-kzb6l\" (UID: \"a215b0cf-24f2-4e4e-a1df-f6d947c27301\") " pod="openshift-must-gather-vzs6p/must-gather-kzb6l" Oct 04 04:01:15 crc kubenswrapper[4964]: I1004 04:01:15.898637 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhm55\" (UniqueName: \"kubernetes.io/projected/a215b0cf-24f2-4e4e-a1df-f6d947c27301-kube-api-access-jhm55\") pod \"must-gather-kzb6l\" (UID: \"a215b0cf-24f2-4e4e-a1df-f6d947c27301\") " pod="openshift-must-gather-vzs6p/must-gather-kzb6l" Oct 04 04:01:15 crc kubenswrapper[4964]: I1004 04:01:15.898737 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a215b0cf-24f2-4e4e-a1df-f6d947c27301-must-gather-output\") pod \"must-gather-kzb6l\" (UID: \"a215b0cf-24f2-4e4e-a1df-f6d947c27301\") " pod="openshift-must-gather-vzs6p/must-gather-kzb6l" Oct 04 04:01:15 crc kubenswrapper[4964]: I1004 04:01:15.899224 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a215b0cf-24f2-4e4e-a1df-f6d947c27301-must-gather-output\") pod \"must-gather-kzb6l\" (UID: \"a215b0cf-24f2-4e4e-a1df-f6d947c27301\") " pod="openshift-must-gather-vzs6p/must-gather-kzb6l" Oct 04 04:01:15 crc kubenswrapper[4964]: I1004 04:01:15.929050 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhm55\" (UniqueName: 
\"kubernetes.io/projected/a215b0cf-24f2-4e4e-a1df-f6d947c27301-kube-api-access-jhm55\") pod \"must-gather-kzb6l\" (UID: \"a215b0cf-24f2-4e4e-a1df-f6d947c27301\") " pod="openshift-must-gather-vzs6p/must-gather-kzb6l" Oct 04 04:01:16 crc kubenswrapper[4964]: I1004 04:01:16.052229 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vzs6p/must-gather-kzb6l" Oct 04 04:01:16 crc kubenswrapper[4964]: I1004 04:01:16.411105 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vzs6p/must-gather-kzb6l"] Oct 04 04:01:16 crc kubenswrapper[4964]: W1004 04:01:16.425111 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda215b0cf_24f2_4e4e_a1df_f6d947c27301.slice/crio-ec2d165a5442bfe208565c34388ea61d06c60f426ee4e158e875a02292a0efbb WatchSource:0}: Error finding container ec2d165a5442bfe208565c34388ea61d06c60f426ee4e158e875a02292a0efbb: Status 404 returned error can't find the container with id ec2d165a5442bfe208565c34388ea61d06c60f426ee4e158e875a02292a0efbb Oct 04 04:01:16 crc kubenswrapper[4964]: I1004 04:01:16.596964 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzs6p/must-gather-kzb6l" event={"ID":"a215b0cf-24f2-4e4e-a1df-f6d947c27301","Type":"ContainerStarted","Data":"ec2d165a5442bfe208565c34388ea61d06c60f426ee4e158e875a02292a0efbb"} Oct 04 04:01:22 crc kubenswrapper[4964]: I1004 04:01:22.677813 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzs6p/must-gather-kzb6l" event={"ID":"a215b0cf-24f2-4e4e-a1df-f6d947c27301","Type":"ContainerStarted","Data":"14f54e6c290ee526a013f17d64cdd8d74cecaf549a0008fab11f0d28a25f3399"} Oct 04 04:01:23 crc kubenswrapper[4964]: I1004 04:01:23.693123 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzs6p/must-gather-kzb6l" 
event={"ID":"a215b0cf-24f2-4e4e-a1df-f6d947c27301","Type":"ContainerStarted","Data":"6c893e9d1772d9f3ae5a3c078896e9cde869f4c34e7e2749297612ebb43c795f"} Oct 04 04:01:23 crc kubenswrapper[4964]: I1004 04:01:23.712229 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vzs6p/must-gather-kzb6l" podStartSLOduration=3.32327452 podStartE2EDuration="8.712216155s" podCreationTimestamp="2025-10-04 04:01:15 +0000 UTC" firstStartedPulling="2025-10-04 04:01:16.431553278 +0000 UTC m=+4856.328511916" lastFinishedPulling="2025-10-04 04:01:21.820494883 +0000 UTC m=+4861.717453551" observedRunningTime="2025-10-04 04:01:23.708820404 +0000 UTC m=+4863.605779082" watchObservedRunningTime="2025-10-04 04:01:23.712216155 +0000 UTC m=+4863.609174793" Oct 04 04:01:28 crc kubenswrapper[4964]: I1004 04:01:28.440578 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vzs6p/crc-debug-h5kmc"] Oct 04 04:01:28 crc kubenswrapper[4964]: I1004 04:01:28.442545 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzs6p/crc-debug-h5kmc" Oct 04 04:01:28 crc kubenswrapper[4964]: I1004 04:01:28.584014 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de91ce13-9dc7-4276-89cd-0dbb7d1762b0-host\") pod \"crc-debug-h5kmc\" (UID: \"de91ce13-9dc7-4276-89cd-0dbb7d1762b0\") " pod="openshift-must-gather-vzs6p/crc-debug-h5kmc" Oct 04 04:01:28 crc kubenswrapper[4964]: I1004 04:01:28.584302 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbkc4\" (UniqueName: \"kubernetes.io/projected/de91ce13-9dc7-4276-89cd-0dbb7d1762b0-kube-api-access-kbkc4\") pod \"crc-debug-h5kmc\" (UID: \"de91ce13-9dc7-4276-89cd-0dbb7d1762b0\") " pod="openshift-must-gather-vzs6p/crc-debug-h5kmc" Oct 04 04:01:28 crc kubenswrapper[4964]: I1004 04:01:28.687087 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de91ce13-9dc7-4276-89cd-0dbb7d1762b0-host\") pod \"crc-debug-h5kmc\" (UID: \"de91ce13-9dc7-4276-89cd-0dbb7d1762b0\") " pod="openshift-must-gather-vzs6p/crc-debug-h5kmc" Oct 04 04:01:28 crc kubenswrapper[4964]: I1004 04:01:28.687424 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbkc4\" (UniqueName: \"kubernetes.io/projected/de91ce13-9dc7-4276-89cd-0dbb7d1762b0-kube-api-access-kbkc4\") pod \"crc-debug-h5kmc\" (UID: \"de91ce13-9dc7-4276-89cd-0dbb7d1762b0\") " pod="openshift-must-gather-vzs6p/crc-debug-h5kmc" Oct 04 04:01:28 crc kubenswrapper[4964]: I1004 04:01:28.687235 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de91ce13-9dc7-4276-89cd-0dbb7d1762b0-host\") pod \"crc-debug-h5kmc\" (UID: \"de91ce13-9dc7-4276-89cd-0dbb7d1762b0\") " pod="openshift-must-gather-vzs6p/crc-debug-h5kmc" Oct 04 04:01:28 crc 
kubenswrapper[4964]: I1004 04:01:28.709472 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbkc4\" (UniqueName: \"kubernetes.io/projected/de91ce13-9dc7-4276-89cd-0dbb7d1762b0-kube-api-access-kbkc4\") pod \"crc-debug-h5kmc\" (UID: \"de91ce13-9dc7-4276-89cd-0dbb7d1762b0\") " pod="openshift-must-gather-vzs6p/crc-debug-h5kmc" Oct 04 04:01:28 crc kubenswrapper[4964]: I1004 04:01:28.759480 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vzs6p/crc-debug-h5kmc" Oct 04 04:01:29 crc kubenswrapper[4964]: I1004 04:01:29.764276 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzs6p/crc-debug-h5kmc" event={"ID":"de91ce13-9dc7-4276-89cd-0dbb7d1762b0","Type":"ContainerStarted","Data":"99956bdfdf60ae04a2894d74d79db73038fe23bb4c619fa5a8a4614a34530c8b"} Oct 04 04:01:38 crc kubenswrapper[4964]: I1004 04:01:38.861904 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzs6p/crc-debug-h5kmc" event={"ID":"de91ce13-9dc7-4276-89cd-0dbb7d1762b0","Type":"ContainerStarted","Data":"3e7f31c851dd6bab9959f23ad41bed4f4d4148bfa5ec5f7e4c7f04f461bb5987"} Oct 04 04:01:38 crc kubenswrapper[4964]: I1004 04:01:38.880493 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vzs6p/crc-debug-h5kmc" podStartSLOduration=1.739554426 podStartE2EDuration="10.880473628s" podCreationTimestamp="2025-10-04 04:01:28 +0000 UTC" firstStartedPulling="2025-10-04 04:01:28.789597896 +0000 UTC m=+4868.686556574" lastFinishedPulling="2025-10-04 04:01:37.930517138 +0000 UTC m=+4877.827475776" observedRunningTime="2025-10-04 04:01:38.880309644 +0000 UTC m=+4878.777268322" watchObservedRunningTime="2025-10-04 04:01:38.880473628 +0000 UTC m=+4878.777432306" Oct 04 04:02:04 crc kubenswrapper[4964]: I1004 04:02:04.462336 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 04:02:04 crc kubenswrapper[4964]: I1004 04:02:04.463014 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:02:33 crc kubenswrapper[4964]: I1004 04:02:33.362464 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56875d85c6-lvzcs_76608f85-e25d-4e88-b6da-93c51f75eba8/barbican-api/0.log" Oct 04 04:02:33 crc kubenswrapper[4964]: I1004 04:02:33.635813 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56875d85c6-lvzcs_76608f85-e25d-4e88-b6da-93c51f75eba8/barbican-api-log/0.log" Oct 04 04:02:33 crc kubenswrapper[4964]: I1004 04:02:33.757587 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6897cddb66-8b6jw_2fdccf76-1497-4e20-bae6-0eecb8f80d2f/barbican-keystone-listener/0.log" Oct 04 04:02:33 crc kubenswrapper[4964]: I1004 04:02:33.970014 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6897cddb66-8b6jw_2fdccf76-1497-4e20-bae6-0eecb8f80d2f/barbican-keystone-listener-log/0.log" Oct 04 04:02:34 crc kubenswrapper[4964]: I1004 04:02:34.107899 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-968687b55-z5p6q_8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0/barbican-worker/0.log" Oct 04 04:02:34 crc kubenswrapper[4964]: I1004 04:02:34.146444 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-968687b55-z5p6q_8eb1b2cd-e6d5-4aed-99d6-45f992a68cf0/barbican-worker-log/0.log" Oct 04 04:02:34 crc kubenswrapper[4964]: I1004 04:02:34.324236 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-lzhgd_ddd2309a-8d5d-4d8a-bc88-63eafcbcf7cb/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 04:02:34 crc kubenswrapper[4964]: I1004 04:02:34.449720 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 04:02:34 crc kubenswrapper[4964]: I1004 04:02:34.449793 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:02:34 crc kubenswrapper[4964]: I1004 04:02:34.618501 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f1f15c57-eddd-4228-b863-b9e9cd1e3c71/ceilometer-central-agent/0.log" Oct 04 04:02:34 crc kubenswrapper[4964]: I1004 04:02:34.786834 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f1f15c57-eddd-4228-b863-b9e9cd1e3c71/proxy-httpd/0.log" Oct 04 04:02:34 crc kubenswrapper[4964]: I1004 04:02:34.788015 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f1f15c57-eddd-4228-b863-b9e9cd1e3c71/ceilometer-notification-agent/0.log" Oct 04 04:02:34 crc kubenswrapper[4964]: I1004 04:02:34.990760 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_f1f15c57-eddd-4228-b863-b9e9cd1e3c71/sg-core/0.log" Oct 04 04:02:35 crc kubenswrapper[4964]: I1004 04:02:35.173172 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-v7kjn_e1160df4-8a02-4583-8edb-acdb474fd9e0/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 04:02:35 crc kubenswrapper[4964]: I1004 04:02:35.373490 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-g7sdm_af8ff1f6-3d01-4248-b4eb-09415f50a4b1/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 04:02:36 crc kubenswrapper[4964]: I1004 04:02:36.066382 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1a05e5d9-10e1-44ab-88bf-c5e04a6af16c/cinder-api/0.log" Oct 04 04:02:36 crc kubenswrapper[4964]: I1004 04:02:36.230350 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1a05e5d9-10e1-44ab-88bf-c5e04a6af16c/cinder-api-log/0.log" Oct 04 04:02:36 crc kubenswrapper[4964]: I1004 04:02:36.287595 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_68b726b8-57ae-48e1-ba37-9e0be7cc3f79/probe/0.log" Oct 04 04:02:36 crc kubenswrapper[4964]: I1004 04:02:36.488343 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c8162366-d682-4f52-8402-4eff0411aae0/cinder-scheduler/0.log" Oct 04 04:02:36 crc kubenswrapper[4964]: I1004 04:02:36.714026 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c8162366-d682-4f52-8402-4eff0411aae0/probe/0.log" Oct 04 04:02:36 crc kubenswrapper[4964]: I1004 04:02:36.835840 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_68b726b8-57ae-48e1-ba37-9e0be7cc3f79/cinder-backup/0.log" Oct 04 04:02:36 crc kubenswrapper[4964]: I1004 04:02:36.933682 4964 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_1eb33c04-b905-4472-839d-89537682be92/probe/0.log" Oct 04 04:02:37 crc kubenswrapper[4964]: I1004 04:02:37.122329 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-27h7d_40eeca54-e1ed-4769-b7cb-1cd5f7be08f3/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 04:02:37 crc kubenswrapper[4964]: I1004 04:02:37.316228 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-wnlck_31d3a8c6-81c5-4226-8bd9-be1d1a2e38a7/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 04:02:37 crc kubenswrapper[4964]: I1004 04:02:37.554800 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-wn8qh_8993dbbc-e0a2-46c8-b3e0-787dce0f121c/init/0.log" Oct 04 04:02:37 crc kubenswrapper[4964]: I1004 04:02:37.731854 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-wn8qh_8993dbbc-e0a2-46c8-b3e0-787dce0f121c/init/0.log" Oct 04 04:02:37 crc kubenswrapper[4964]: I1004 04:02:37.971140 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-wn8qh_8993dbbc-e0a2-46c8-b3e0-787dce0f121c/dnsmasq-dns/0.log" Oct 04 04:02:38 crc kubenswrapper[4964]: I1004 04:02:38.214551 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_36e18875-3436-4465-85fc-f0a240394665/glance-log/0.log" Oct 04 04:02:38 crc kubenswrapper[4964]: I1004 04:02:38.230160 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_36e18875-3436-4465-85fc-f0a240394665/glance-httpd/0.log" Oct 04 04:02:38 crc kubenswrapper[4964]: I1004 04:02:38.471063 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_f0ea5ed8-e2bb-461c-9541-4e04e899684c/glance-httpd/0.log" Oct 04 04:02:38 crc kubenswrapper[4964]: I1004 04:02:38.687566 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f0ea5ed8-e2bb-461c-9541-4e04e899684c/glance-log/0.log" Oct 04 04:02:38 crc kubenswrapper[4964]: I1004 04:02:38.992192 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-64f9d99668-zvfzz_39f65132-4f7b-4c79-ba9b-e86c15ec60d6/horizon/0.log" Oct 04 04:02:39 crc kubenswrapper[4964]: I1004 04:02:39.311248 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-64f9d99668-zvfzz_39f65132-4f7b-4c79-ba9b-e86c15ec60d6/horizon-log/0.log" Oct 04 04:02:39 crc kubenswrapper[4964]: I1004 04:02:39.381634 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gt9lf_37d65c2b-caa2-47f0-917c-059ff235e83d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 04:02:39 crc kubenswrapper[4964]: I1004 04:02:39.599014 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-nvqkn_7243e183-f415-4ce5-9b98-6abb97122104/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 04:02:40 crc kubenswrapper[4964]: I1004 04:02:40.175713 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29325781-z42x5_4d97553d-eed6-464e-ac05-89e3591b8bb0/keystone-cron/0.log" Oct 04 04:02:40 crc kubenswrapper[4964]: I1004 04:02:40.545805 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29325841-5kllh_70f3b5ef-db47-47f7-a99f-d31cfaa2d5e3/keystone-cron/0.log" Oct 04 04:02:40 crc kubenswrapper[4964]: I1004 04:02:40.628646 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-58576f76db-7mmj9_0340ae15-3a89-4efd-b2e0-72b8ee3d2e6f/keystone-api/0.log" Oct 04 04:02:40 crc kubenswrapper[4964]: I1004 04:02:40.802245 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_82986d74-7878-4a81-b004-447c44700cd9/kube-state-metrics/0.log" Oct 04 04:02:41 crc kubenswrapper[4964]: I1004 04:02:41.063745 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-jj52g_31c841f6-d2c2-4557-8f03-ede13fec6dc0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 04:02:41 crc kubenswrapper[4964]: I1004 04:02:41.404774 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_e937ceaf-8613-4a23-a565-adafc14c8172/manila-api-log/0.log" Oct 04 04:02:41 crc kubenswrapper[4964]: I1004 04:02:41.406521 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_e937ceaf-8613-4a23-a565-adafc14c8172/manila-api/0.log" Oct 04 04:02:41 crc kubenswrapper[4964]: I1004 04:02:41.752390 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_0e479211-b081-4b46-93b3-e1ae824dd73a/manila-scheduler/0.log" Oct 04 04:02:41 crc kubenswrapper[4964]: I1004 04:02:41.808455 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_0e479211-b081-4b46-93b3-e1ae824dd73a/probe/0.log" Oct 04 04:02:42 crc kubenswrapper[4964]: I1004 04:02:42.017593 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_8a194e94-6624-49cc-ba2a-19860c8c95bf/manila-share/0.log" Oct 04 04:02:42 crc kubenswrapper[4964]: I1004 04:02:42.401558 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_8a194e94-6624-49cc-ba2a-19860c8c95bf/probe/0.log" Oct 04 04:02:42 crc kubenswrapper[4964]: I1004 04:02:42.789985 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-volume-volume1-0_1eb33c04-b905-4472-839d-89537682be92/cinder-volume/0.log" Oct 04 04:02:43 crc kubenswrapper[4964]: I1004 04:02:43.176816 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76d58c8cb5-9l2w8_cce3503b-b92d-434a-b056-fa6832cff6d4/neutron-httpd/0.log" Oct 04 04:02:43 crc kubenswrapper[4964]: I1004 04:02:43.188676 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76d58c8cb5-9l2w8_cce3503b-b92d-434a-b056-fa6832cff6d4/neutron-api/0.log" Oct 04 04:02:43 crc kubenswrapper[4964]: I1004 04:02:43.435308 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-4txzb_1e86be92-e0c6-4e1d-ba37-28f6c32a08d5/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 04:02:44 crc kubenswrapper[4964]: I1004 04:02:44.332791 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_32379b6f-b326-4de2-800d-09cd730119d4/nova-api-log/0.log" Oct 04 04:02:44 crc kubenswrapper[4964]: I1004 04:02:44.631949 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_32379b6f-b326-4de2-800d-09cd730119d4/nova-api-api/0.log" Oct 04 04:02:45 crc kubenswrapper[4964]: I1004 04:02:45.612955 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b1282768-6281-40a2-aaec-02d55f52579d/nova-cell0-conductor-conductor/0.log" Oct 04 04:02:45 crc kubenswrapper[4964]: I1004 04:02:45.762851 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_1b760ccf-f6fc-4396-8504-38e08c6d1737/nova-cell1-conductor-conductor/0.log" Oct 04 04:02:46 crc kubenswrapper[4964]: I1004 04:02:46.046647 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_e392b9b2-79f9-4814-8dc6-2966cae7a018/nova-cell1-novncproxy-novncproxy/0.log" Oct 04 04:02:46 crc 
kubenswrapper[4964]: I1004 04:02:46.491382 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-ljmds_2882ad3d-53fb-4ccf-aa3b-fe34165726a4/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 04:02:47 crc kubenswrapper[4964]: I1004 04:02:47.136844 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_fcacbb93-ef68-4fdf-a30f-a7cd458809ae/nova-metadata-log/0.log" Oct 04 04:02:47 crc kubenswrapper[4964]: I1004 04:02:47.826984 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_cd4751bc-b2dd-4323-83d8-45f639f1a72a/nova-scheduler-scheduler/0.log" Oct 04 04:02:48 crc kubenswrapper[4964]: I1004 04:02:48.253691 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6/mysql-bootstrap/0.log" Oct 04 04:02:48 crc kubenswrapper[4964]: I1004 04:02:48.475357 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6/mysql-bootstrap/0.log" Oct 04 04:02:48 crc kubenswrapper[4964]: I1004 04:02:48.697594 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0fd3a3a0-caf8-4e52-bb08-a2eea5b6c5f6/galera/0.log" Oct 04 04:02:49 crc kubenswrapper[4964]: I1004 04:02:49.316968 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_79d85912-9774-4b24-bacc-13feb8d11ca4/mysql-bootstrap/0.log" Oct 04 04:02:49 crc kubenswrapper[4964]: I1004 04:02:49.379893 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_fcacbb93-ef68-4fdf-a30f-a7cd458809ae/nova-metadata-metadata/0.log" Oct 04 04:02:49 crc kubenswrapper[4964]: I1004 04:02:49.506822 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_79d85912-9774-4b24-bacc-13feb8d11ca4/mysql-bootstrap/0.log" Oct 04 04:02:49 crc kubenswrapper[4964]: I1004 04:02:49.642388 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_79d85912-9774-4b24-bacc-13feb8d11ca4/galera/0.log" Oct 04 04:02:49 crc kubenswrapper[4964]: I1004 04:02:49.895828 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_0e34537c-4b0f-4683-bf8a-3b56e44424b1/openstackclient/0.log" Oct 04 04:02:50 crc kubenswrapper[4964]: I1004 04:02:50.099049 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-c6kng_b37ef67d-6614-4b44-9435-a35a4939caf7/ovn-controller/0.log" Oct 04 04:02:50 crc kubenswrapper[4964]: I1004 04:02:50.269770 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-69tgq_6f812e77-519b-4703-8215-d16a2cb188dd/openstack-network-exporter/0.log" Oct 04 04:02:50 crc kubenswrapper[4964]: I1004 04:02:50.710465 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fj9vk_c45ccbc0-f08a-43ed-b80b-620fd961cb2d/ovsdb-server-init/0.log" Oct 04 04:02:50 crc kubenswrapper[4964]: I1004 04:02:50.900547 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fj9vk_c45ccbc0-f08a-43ed-b80b-620fd961cb2d/ovs-vswitchd/0.log" Oct 04 04:02:50 crc kubenswrapper[4964]: I1004 04:02:50.905892 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fj9vk_c45ccbc0-f08a-43ed-b80b-620fd961cb2d/ovsdb-server-init/0.log" Oct 04 04:02:51 crc kubenswrapper[4964]: I1004 04:02:51.078809 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fj9vk_c45ccbc0-f08a-43ed-b80b-620fd961cb2d/ovsdb-server/0.log" Oct 04 04:02:51 crc kubenswrapper[4964]: I1004 04:02:51.306984 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-84rt5_8b34ed53-2408-4cab-9c93-eed783a2f31c/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 04:02:51 crc kubenswrapper[4964]: I1004 04:02:51.494651 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4/openstack-network-exporter/0.log" Oct 04 04:02:51 crc kubenswrapper[4964]: I1004 04:02:51.580210 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e1e3c67f-6de6-4a44-a3a3-9ca24a141ac4/ovn-northd/0.log" Oct 04 04:02:51 crc kubenswrapper[4964]: I1004 04:02:51.782926 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2f1fa594-fc65-4b98-9fad-a3e2cb027eba/openstack-network-exporter/0.log" Oct 04 04:02:51 crc kubenswrapper[4964]: I1004 04:02:51.915054 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2f1fa594-fc65-4b98-9fad-a3e2cb027eba/ovsdbserver-nb/0.log" Oct 04 04:02:52 crc kubenswrapper[4964]: I1004 04:02:52.097325 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_71c6ccd3-a834-4a46-a25d-c92b7653c846/openstack-network-exporter/0.log" Oct 04 04:02:52 crc kubenswrapper[4964]: I1004 04:02:52.180676 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_71c6ccd3-a834-4a46-a25d-c92b7653c846/ovsdbserver-sb/0.log" Oct 04 04:02:52 crc kubenswrapper[4964]: I1004 04:02:52.485953 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-855ccd47c4-qrtzn_b18dfa90-2818-4164-a806-41cb55bb188c/placement-api/0.log" Oct 04 04:02:52 crc kubenswrapper[4964]: I1004 04:02:52.619822 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-855ccd47c4-qrtzn_b18dfa90-2818-4164-a806-41cb55bb188c/placement-log/0.log" Oct 04 04:02:52 crc kubenswrapper[4964]: I1004 04:02:52.849064 4964 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8a662b31-7b7d-4491-bdc3-0b5c48b52f8c/setup-container/0.log" Oct 04 04:02:53 crc kubenswrapper[4964]: I1004 04:02:53.031443 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8a662b31-7b7d-4491-bdc3-0b5c48b52f8c/setup-container/0.log" Oct 04 04:02:53 crc kubenswrapper[4964]: I1004 04:02:53.108260 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8a662b31-7b7d-4491-bdc3-0b5c48b52f8c/rabbitmq/0.log" Oct 04 04:02:53 crc kubenswrapper[4964]: I1004 04:02:53.335341 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_96c7a0c3-f572-4493-b028-bcbafee4dd24/setup-container/0.log" Oct 04 04:02:53 crc kubenswrapper[4964]: I1004 04:02:53.548661 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_96c7a0c3-f572-4493-b028-bcbafee4dd24/setup-container/0.log" Oct 04 04:02:53 crc kubenswrapper[4964]: I1004 04:02:53.661257 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_96c7a0c3-f572-4493-b028-bcbafee4dd24/rabbitmq/0.log" Oct 04 04:02:54 crc kubenswrapper[4964]: I1004 04:02:54.102726 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-f77cj_aea639a4-f63d-46d5-abc4-d574ac966161/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 04:02:54 crc kubenswrapper[4964]: I1004 04:02:54.350336 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-rfxkg_1f74c64d-ceb1-4e0b-b1e3-dad667f2bd7a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 04:02:54 crc kubenswrapper[4964]: I1004 04:02:54.562659 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-qjzml_13993097-e1a1-4f3a-8e38-eee10934a97d/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 04:02:55 crc kubenswrapper[4964]: I1004 04:02:55.011455 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-8xppt_591298c0-563f-43b7-9291-9463885f7c6c/ssh-known-hosts-edpm-deployment/0.log" Oct 04 04:02:55 crc kubenswrapper[4964]: I1004 04:02:55.202589 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_a803f4c5-eef4-426a-acea-039c19405797/tempest-tests-tempest-tests-runner/0.log" Oct 04 04:02:55 crc kubenswrapper[4964]: I1004 04:02:55.237812 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_84508e92-e7bb-4e60-b695-cfdb38605416/test-operator-logs-container/0.log" Oct 04 04:02:55 crc kubenswrapper[4964]: I1004 04:02:55.664508 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-mwjct_d7c8421d-a33a-45b1-89f6-f2d57dc16ef6/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 04:03:04 crc kubenswrapper[4964]: I1004 04:03:04.448517 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 04:03:04 crc kubenswrapper[4964]: I1004 04:03:04.448975 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:03:04 crc kubenswrapper[4964]: 
I1004 04:03:04.449017 4964 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" Oct 04 04:03:04 crc kubenswrapper[4964]: I1004 04:03:04.449817 4964 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c"} pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 04:03:04 crc kubenswrapper[4964]: I1004 04:03:04.449869 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" containerID="cri-o://83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" gracePeriod=600 Oct 04 04:03:04 crc kubenswrapper[4964]: I1004 04:03:04.780214 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_6a557673-3117-4fd4-8f34-28e1b7541c9c/memcached/0.log" Oct 04 04:03:04 crc kubenswrapper[4964]: E1004 04:03:04.804727 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:03:05 crc kubenswrapper[4964]: I1004 04:03:05.680523 4964 generic.go:334] "Generic (PLEG): container finished" podID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" exitCode=0 Oct 04 04:03:05 crc kubenswrapper[4964]: I1004 
04:03:05.680580 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerDied","Data":"83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c"} Oct 04 04:03:05 crc kubenswrapper[4964]: I1004 04:03:05.680643 4964 scope.go:117] "RemoveContainer" containerID="7acd97a2d943148214dec2a1e45b876bb35371b2538d34908344419f0e83dc2c" Oct 04 04:03:05 crc kubenswrapper[4964]: I1004 04:03:05.681264 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:03:05 crc kubenswrapper[4964]: E1004 04:03:05.681533 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:03:19 crc kubenswrapper[4964]: I1004 04:03:19.846522 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:03:19 crc kubenswrapper[4964]: E1004 04:03:19.847559 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:03:32 crc kubenswrapper[4964]: I1004 04:03:32.847225 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 
04:03:32 crc kubenswrapper[4964]: E1004 04:03:32.848012 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:03:38 crc kubenswrapper[4964]: I1004 04:03:38.051071 4964 generic.go:334] "Generic (PLEG): container finished" podID="de91ce13-9dc7-4276-89cd-0dbb7d1762b0" containerID="3e7f31c851dd6bab9959f23ad41bed4f4d4148bfa5ec5f7e4c7f04f461bb5987" exitCode=0 Oct 04 04:03:38 crc kubenswrapper[4964]: I1004 04:03:38.051438 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzs6p/crc-debug-h5kmc" event={"ID":"de91ce13-9dc7-4276-89cd-0dbb7d1762b0","Type":"ContainerDied","Data":"3e7f31c851dd6bab9959f23ad41bed4f4d4148bfa5ec5f7e4c7f04f461bb5987"} Oct 04 04:03:39 crc kubenswrapper[4964]: I1004 04:03:39.216316 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzs6p/crc-debug-h5kmc" Oct 04 04:03:39 crc kubenswrapper[4964]: I1004 04:03:39.266069 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vzs6p/crc-debug-h5kmc"] Oct 04 04:03:39 crc kubenswrapper[4964]: I1004 04:03:39.278052 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vzs6p/crc-debug-h5kmc"] Oct 04 04:03:39 crc kubenswrapper[4964]: I1004 04:03:39.312402 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de91ce13-9dc7-4276-89cd-0dbb7d1762b0-host\") pod \"de91ce13-9dc7-4276-89cd-0dbb7d1762b0\" (UID: \"de91ce13-9dc7-4276-89cd-0dbb7d1762b0\") " Oct 04 04:03:39 crc kubenswrapper[4964]: I1004 04:03:39.312554 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de91ce13-9dc7-4276-89cd-0dbb7d1762b0-host" (OuterVolumeSpecName: "host") pod "de91ce13-9dc7-4276-89cd-0dbb7d1762b0" (UID: "de91ce13-9dc7-4276-89cd-0dbb7d1762b0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:03:39 crc kubenswrapper[4964]: I1004 04:03:39.312766 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbkc4\" (UniqueName: \"kubernetes.io/projected/de91ce13-9dc7-4276-89cd-0dbb7d1762b0-kube-api-access-kbkc4\") pod \"de91ce13-9dc7-4276-89cd-0dbb7d1762b0\" (UID: \"de91ce13-9dc7-4276-89cd-0dbb7d1762b0\") " Oct 04 04:03:39 crc kubenswrapper[4964]: I1004 04:03:39.313290 4964 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de91ce13-9dc7-4276-89cd-0dbb7d1762b0-host\") on node \"crc\" DevicePath \"\"" Oct 04 04:03:39 crc kubenswrapper[4964]: I1004 04:03:39.319048 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de91ce13-9dc7-4276-89cd-0dbb7d1762b0-kube-api-access-kbkc4" (OuterVolumeSpecName: "kube-api-access-kbkc4") pod "de91ce13-9dc7-4276-89cd-0dbb7d1762b0" (UID: "de91ce13-9dc7-4276-89cd-0dbb7d1762b0"). InnerVolumeSpecName "kube-api-access-kbkc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:03:39 crc kubenswrapper[4964]: I1004 04:03:39.415381 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbkc4\" (UniqueName: \"kubernetes.io/projected/de91ce13-9dc7-4276-89cd-0dbb7d1762b0-kube-api-access-kbkc4\") on node \"crc\" DevicePath \"\"" Oct 04 04:03:40 crc kubenswrapper[4964]: I1004 04:03:40.076233 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99956bdfdf60ae04a2894d74d79db73038fe23bb4c619fa5a8a4614a34530c8b" Oct 04 04:03:40 crc kubenswrapper[4964]: I1004 04:03:40.076344 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzs6p/crc-debug-h5kmc" Oct 04 04:03:40 crc kubenswrapper[4964]: I1004 04:03:40.445678 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vzs6p/crc-debug-fh5ck"] Oct 04 04:03:40 crc kubenswrapper[4964]: E1004 04:03:40.446162 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de91ce13-9dc7-4276-89cd-0dbb7d1762b0" containerName="container-00" Oct 04 04:03:40 crc kubenswrapper[4964]: I1004 04:03:40.446180 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="de91ce13-9dc7-4276-89cd-0dbb7d1762b0" containerName="container-00" Oct 04 04:03:40 crc kubenswrapper[4964]: I1004 04:03:40.446401 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="de91ce13-9dc7-4276-89cd-0dbb7d1762b0" containerName="container-00" Oct 04 04:03:40 crc kubenswrapper[4964]: I1004 04:03:40.447233 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vzs6p/crc-debug-fh5ck" Oct 04 04:03:40 crc kubenswrapper[4964]: I1004 04:03:40.539305 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/736fbce8-314f-43a9-bcd1-9fe3f53774c3-host\") pod \"crc-debug-fh5ck\" (UID: \"736fbce8-314f-43a9-bcd1-9fe3f53774c3\") " pod="openshift-must-gather-vzs6p/crc-debug-fh5ck" Oct 04 04:03:40 crc kubenswrapper[4964]: I1004 04:03:40.539396 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8b7l\" (UniqueName: \"kubernetes.io/projected/736fbce8-314f-43a9-bcd1-9fe3f53774c3-kube-api-access-z8b7l\") pod \"crc-debug-fh5ck\" (UID: \"736fbce8-314f-43a9-bcd1-9fe3f53774c3\") " pod="openshift-must-gather-vzs6p/crc-debug-fh5ck" Oct 04 04:03:40 crc kubenswrapper[4964]: I1004 04:03:40.642428 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/736fbce8-314f-43a9-bcd1-9fe3f53774c3-host\") pod \"crc-debug-fh5ck\" (UID: \"736fbce8-314f-43a9-bcd1-9fe3f53774c3\") " pod="openshift-must-gather-vzs6p/crc-debug-fh5ck" Oct 04 04:03:40 crc kubenswrapper[4964]: I1004 04:03:40.643020 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8b7l\" (UniqueName: \"kubernetes.io/projected/736fbce8-314f-43a9-bcd1-9fe3f53774c3-kube-api-access-z8b7l\") pod \"crc-debug-fh5ck\" (UID: \"736fbce8-314f-43a9-bcd1-9fe3f53774c3\") " pod="openshift-must-gather-vzs6p/crc-debug-fh5ck" Oct 04 04:03:40 crc kubenswrapper[4964]: I1004 04:03:40.642672 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/736fbce8-314f-43a9-bcd1-9fe3f53774c3-host\") pod \"crc-debug-fh5ck\" (UID: \"736fbce8-314f-43a9-bcd1-9fe3f53774c3\") " pod="openshift-must-gather-vzs6p/crc-debug-fh5ck" Oct 04 04:03:40 crc kubenswrapper[4964]: I1004 04:03:40.665436 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8b7l\" (UniqueName: \"kubernetes.io/projected/736fbce8-314f-43a9-bcd1-9fe3f53774c3-kube-api-access-z8b7l\") pod \"crc-debug-fh5ck\" (UID: \"736fbce8-314f-43a9-bcd1-9fe3f53774c3\") " pod="openshift-must-gather-vzs6p/crc-debug-fh5ck" Oct 04 04:03:40 crc kubenswrapper[4964]: I1004 04:03:40.768823 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzs6p/crc-debug-fh5ck" Oct 04 04:03:40 crc kubenswrapper[4964]: I1004 04:03:40.857140 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de91ce13-9dc7-4276-89cd-0dbb7d1762b0" path="/var/lib/kubelet/pods/de91ce13-9dc7-4276-89cd-0dbb7d1762b0/volumes" Oct 04 04:03:41 crc kubenswrapper[4964]: I1004 04:03:41.104826 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzs6p/crc-debug-fh5ck" event={"ID":"736fbce8-314f-43a9-bcd1-9fe3f53774c3","Type":"ContainerStarted","Data":"c13304d3f0ebb4ef43972e37ab4b74b610072f943070326424412e6ec19ea4a0"} Oct 04 04:03:41 crc kubenswrapper[4964]: I1004 04:03:41.105191 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzs6p/crc-debug-fh5ck" event={"ID":"736fbce8-314f-43a9-bcd1-9fe3f53774c3","Type":"ContainerStarted","Data":"09208543f9e37dee2eec1b835a4bd0a8e878a8c0073626c3654673267adfd18b"} Oct 04 04:03:42 crc kubenswrapper[4964]: I1004 04:03:42.121171 4964 generic.go:334] "Generic (PLEG): container finished" podID="736fbce8-314f-43a9-bcd1-9fe3f53774c3" containerID="c13304d3f0ebb4ef43972e37ab4b74b610072f943070326424412e6ec19ea4a0" exitCode=0 Oct 04 04:03:42 crc kubenswrapper[4964]: I1004 04:03:42.121233 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzs6p/crc-debug-fh5ck" event={"ID":"736fbce8-314f-43a9-bcd1-9fe3f53774c3","Type":"ContainerDied","Data":"c13304d3f0ebb4ef43972e37ab4b74b610072f943070326424412e6ec19ea4a0"} Oct 04 04:03:43 crc kubenswrapper[4964]: I1004 04:03:43.219430 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzs6p/crc-debug-fh5ck" Oct 04 04:03:43 crc kubenswrapper[4964]: I1004 04:03:43.291999 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/736fbce8-314f-43a9-bcd1-9fe3f53774c3-host\") pod \"736fbce8-314f-43a9-bcd1-9fe3f53774c3\" (UID: \"736fbce8-314f-43a9-bcd1-9fe3f53774c3\") " Oct 04 04:03:43 crc kubenswrapper[4964]: I1004 04:03:43.292120 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8b7l\" (UniqueName: \"kubernetes.io/projected/736fbce8-314f-43a9-bcd1-9fe3f53774c3-kube-api-access-z8b7l\") pod \"736fbce8-314f-43a9-bcd1-9fe3f53774c3\" (UID: \"736fbce8-314f-43a9-bcd1-9fe3f53774c3\") " Oct 04 04:03:43 crc kubenswrapper[4964]: I1004 04:03:43.292131 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736fbce8-314f-43a9-bcd1-9fe3f53774c3-host" (OuterVolumeSpecName: "host") pod "736fbce8-314f-43a9-bcd1-9fe3f53774c3" (UID: "736fbce8-314f-43a9-bcd1-9fe3f53774c3"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:03:43 crc kubenswrapper[4964]: I1004 04:03:43.292821 4964 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/736fbce8-314f-43a9-bcd1-9fe3f53774c3-host\") on node \"crc\" DevicePath \"\"" Oct 04 04:03:43 crc kubenswrapper[4964]: I1004 04:03:43.298219 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736fbce8-314f-43a9-bcd1-9fe3f53774c3-kube-api-access-z8b7l" (OuterVolumeSpecName: "kube-api-access-z8b7l") pod "736fbce8-314f-43a9-bcd1-9fe3f53774c3" (UID: "736fbce8-314f-43a9-bcd1-9fe3f53774c3"). InnerVolumeSpecName "kube-api-access-z8b7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:03:43 crc kubenswrapper[4964]: I1004 04:03:43.397892 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8b7l\" (UniqueName: \"kubernetes.io/projected/736fbce8-314f-43a9-bcd1-9fe3f53774c3-kube-api-access-z8b7l\") on node \"crc\" DevicePath \"\"" Oct 04 04:03:43 crc kubenswrapper[4964]: I1004 04:03:43.846512 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:03:43 crc kubenswrapper[4964]: E1004 04:03:43.846839 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:03:44 crc kubenswrapper[4964]: I1004 04:03:44.156496 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzs6p/crc-debug-fh5ck" event={"ID":"736fbce8-314f-43a9-bcd1-9fe3f53774c3","Type":"ContainerDied","Data":"09208543f9e37dee2eec1b835a4bd0a8e878a8c0073626c3654673267adfd18b"} Oct 04 04:03:44 crc kubenswrapper[4964]: I1004 04:03:44.156539 4964 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09208543f9e37dee2eec1b835a4bd0a8e878a8c0073626c3654673267adfd18b" Oct 04 04:03:44 crc kubenswrapper[4964]: I1004 04:03:44.156574 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzs6p/crc-debug-fh5ck" Oct 04 04:03:49 crc kubenswrapper[4964]: I1004 04:03:49.403223 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jtfn2"] Oct 04 04:03:49 crc kubenswrapper[4964]: E1004 04:03:49.404511 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736fbce8-314f-43a9-bcd1-9fe3f53774c3" containerName="container-00" Oct 04 04:03:49 crc kubenswrapper[4964]: I1004 04:03:49.404590 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="736fbce8-314f-43a9-bcd1-9fe3f53774c3" containerName="container-00" Oct 04 04:03:49 crc kubenswrapper[4964]: I1004 04:03:49.404816 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="736fbce8-314f-43a9-bcd1-9fe3f53774c3" containerName="container-00" Oct 04 04:03:49 crc kubenswrapper[4964]: I1004 04:03:49.406520 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:03:49 crc kubenswrapper[4964]: I1004 04:03:49.421567 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jtfn2"] Oct 04 04:03:49 crc kubenswrapper[4964]: I1004 04:03:49.487926 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn2hp\" (UniqueName: \"kubernetes.io/projected/ed8a47d5-0d0c-41c4-bddb-39686a42483c-kube-api-access-nn2hp\") pod \"community-operators-jtfn2\" (UID: \"ed8a47d5-0d0c-41c4-bddb-39686a42483c\") " pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:03:49 crc kubenswrapper[4964]: I1004 04:03:49.487975 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8a47d5-0d0c-41c4-bddb-39686a42483c-catalog-content\") pod \"community-operators-jtfn2\" (UID: \"ed8a47d5-0d0c-41c4-bddb-39686a42483c\") " 
pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:03:49 crc kubenswrapper[4964]: I1004 04:03:49.487997 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8a47d5-0d0c-41c4-bddb-39686a42483c-utilities\") pod \"community-operators-jtfn2\" (UID: \"ed8a47d5-0d0c-41c4-bddb-39686a42483c\") " pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:03:49 crc kubenswrapper[4964]: I1004 04:03:49.589172 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn2hp\" (UniqueName: \"kubernetes.io/projected/ed8a47d5-0d0c-41c4-bddb-39686a42483c-kube-api-access-nn2hp\") pod \"community-operators-jtfn2\" (UID: \"ed8a47d5-0d0c-41c4-bddb-39686a42483c\") " pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:03:49 crc kubenswrapper[4964]: I1004 04:03:49.589221 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8a47d5-0d0c-41c4-bddb-39686a42483c-catalog-content\") pod \"community-operators-jtfn2\" (UID: \"ed8a47d5-0d0c-41c4-bddb-39686a42483c\") " pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:03:49 crc kubenswrapper[4964]: I1004 04:03:49.589244 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8a47d5-0d0c-41c4-bddb-39686a42483c-utilities\") pod \"community-operators-jtfn2\" (UID: \"ed8a47d5-0d0c-41c4-bddb-39686a42483c\") " pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:03:49 crc kubenswrapper[4964]: I1004 04:03:49.589738 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8a47d5-0d0c-41c4-bddb-39686a42483c-catalog-content\") pod \"community-operators-jtfn2\" (UID: \"ed8a47d5-0d0c-41c4-bddb-39686a42483c\") " 
pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:03:49 crc kubenswrapper[4964]: I1004 04:03:49.589782 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8a47d5-0d0c-41c4-bddb-39686a42483c-utilities\") pod \"community-operators-jtfn2\" (UID: \"ed8a47d5-0d0c-41c4-bddb-39686a42483c\") " pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:03:49 crc kubenswrapper[4964]: I1004 04:03:49.633784 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn2hp\" (UniqueName: \"kubernetes.io/projected/ed8a47d5-0d0c-41c4-bddb-39686a42483c-kube-api-access-nn2hp\") pod \"community-operators-jtfn2\" (UID: \"ed8a47d5-0d0c-41c4-bddb-39686a42483c\") " pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:03:49 crc kubenswrapper[4964]: I1004 04:03:49.729980 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:03:50 crc kubenswrapper[4964]: I1004 04:03:50.287844 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jtfn2"] Oct 04 04:03:51 crc kubenswrapper[4964]: I1004 04:03:51.217565 4964 generic.go:334] "Generic (PLEG): container finished" podID="ed8a47d5-0d0c-41c4-bddb-39686a42483c" containerID="82a86bdc0242e353cb68ff7cf6067f528b9b3597881272926d112bc013bff8f9" exitCode=0 Oct 04 04:03:51 crc kubenswrapper[4964]: I1004 04:03:51.217854 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtfn2" event={"ID":"ed8a47d5-0d0c-41c4-bddb-39686a42483c","Type":"ContainerDied","Data":"82a86bdc0242e353cb68ff7cf6067f528b9b3597881272926d112bc013bff8f9"} Oct 04 04:03:51 crc kubenswrapper[4964]: I1004 04:03:51.217879 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtfn2" 
event={"ID":"ed8a47d5-0d0c-41c4-bddb-39686a42483c","Type":"ContainerStarted","Data":"39817571488bdf9c2299bbac58d571a7d504f9e21e186700ba39e04aabc09f0a"} Oct 04 04:03:51 crc kubenswrapper[4964]: I1004 04:03:51.497028 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vzs6p/crc-debug-fh5ck"] Oct 04 04:03:51 crc kubenswrapper[4964]: I1004 04:03:51.506187 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vzs6p/crc-debug-fh5ck"] Oct 04 04:03:52 crc kubenswrapper[4964]: I1004 04:03:52.766531 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vzs6p/crc-debug-gnjdl"] Oct 04 04:03:52 crc kubenswrapper[4964]: I1004 04:03:52.768966 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vzs6p/crc-debug-gnjdl" Oct 04 04:03:52 crc kubenswrapper[4964]: I1004 04:03:52.859550 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d-host\") pod \"crc-debug-gnjdl\" (UID: \"4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d\") " pod="openshift-must-gather-vzs6p/crc-debug-gnjdl" Oct 04 04:03:52 crc kubenswrapper[4964]: I1004 04:03:52.859685 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9cvw\" (UniqueName: \"kubernetes.io/projected/4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d-kube-api-access-g9cvw\") pod \"crc-debug-gnjdl\" (UID: \"4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d\") " pod="openshift-must-gather-vzs6p/crc-debug-gnjdl" Oct 04 04:03:52 crc kubenswrapper[4964]: I1004 04:03:52.864704 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736fbce8-314f-43a9-bcd1-9fe3f53774c3" path="/var/lib/kubelet/pods/736fbce8-314f-43a9-bcd1-9fe3f53774c3/volumes" Oct 04 04:03:52 crc kubenswrapper[4964]: I1004 04:03:52.963652 4964 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d-host\") pod \"crc-debug-gnjdl\" (UID: \"4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d\") " pod="openshift-must-gather-vzs6p/crc-debug-gnjdl" Oct 04 04:03:52 crc kubenswrapper[4964]: I1004 04:03:52.964018 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d-host\") pod \"crc-debug-gnjdl\" (UID: \"4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d\") " pod="openshift-must-gather-vzs6p/crc-debug-gnjdl" Oct 04 04:03:52 crc kubenswrapper[4964]: I1004 04:03:52.964235 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9cvw\" (UniqueName: \"kubernetes.io/projected/4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d-kube-api-access-g9cvw\") pod \"crc-debug-gnjdl\" (UID: \"4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d\") " pod="openshift-must-gather-vzs6p/crc-debug-gnjdl" Oct 04 04:03:53 crc kubenswrapper[4964]: I1004 04:03:53.787949 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9cvw\" (UniqueName: \"kubernetes.io/projected/4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d-kube-api-access-g9cvw\") pod \"crc-debug-gnjdl\" (UID: \"4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d\") " pod="openshift-must-gather-vzs6p/crc-debug-gnjdl" Oct 04 04:03:54 crc kubenswrapper[4964]: I1004 04:03:54.016924 4964 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzs6p/crc-debug-gnjdl" Oct 04 04:03:54 crc kubenswrapper[4964]: I1004 04:03:54.249097 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzs6p/crc-debug-gnjdl" event={"ID":"4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d","Type":"ContainerStarted","Data":"58477268e3b67c29d6083648e2acf35c75b356db1a837ac4d7a179dc4e1183ce"} Oct 04 04:03:54 crc kubenswrapper[4964]: I1004 04:03:54.252374 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtfn2" event={"ID":"ed8a47d5-0d0c-41c4-bddb-39686a42483c","Type":"ContainerStarted","Data":"06f869012d5567a5a401e24fbcbde0566acdde0c3f02108468b599ac47cbef6e"} Oct 04 04:03:55 crc kubenswrapper[4964]: I1004 04:03:55.267666 4964 generic.go:334] "Generic (PLEG): container finished" podID="4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d" containerID="0aab7671f0ad4e9fe3952b8d341e28621313f431bc33f59793e33d8d931872a3" exitCode=0 Oct 04 04:03:55 crc kubenswrapper[4964]: I1004 04:03:55.267992 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzs6p/crc-debug-gnjdl" event={"ID":"4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d","Type":"ContainerDied","Data":"0aab7671f0ad4e9fe3952b8d341e28621313f431bc33f59793e33d8d931872a3"} Oct 04 04:03:55 crc kubenswrapper[4964]: I1004 04:03:55.331899 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vzs6p/crc-debug-gnjdl"] Oct 04 04:03:55 crc kubenswrapper[4964]: I1004 04:03:55.340495 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vzs6p/crc-debug-gnjdl"] Oct 04 04:03:55 crc kubenswrapper[4964]: I1004 04:03:55.846164 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:03:55 crc kubenswrapper[4964]: E1004 04:03:55.846766 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:03:56 crc kubenswrapper[4964]: I1004 04:03:56.292364 4964 generic.go:334] "Generic (PLEG): container finished" podID="ed8a47d5-0d0c-41c4-bddb-39686a42483c" containerID="06f869012d5567a5a401e24fbcbde0566acdde0c3f02108468b599ac47cbef6e" exitCode=0 Oct 04 04:03:56 crc kubenswrapper[4964]: I1004 04:03:56.292586 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtfn2" event={"ID":"ed8a47d5-0d0c-41c4-bddb-39686a42483c","Type":"ContainerDied","Data":"06f869012d5567a5a401e24fbcbde0566acdde0c3f02108468b599ac47cbef6e"} Oct 04 04:03:56 crc kubenswrapper[4964]: I1004 04:03:56.394312 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vzs6p/crc-debug-gnjdl" Oct 04 04:03:56 crc kubenswrapper[4964]: I1004 04:03:56.448301 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d-host\") pod \"4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d\" (UID: \"4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d\") " Oct 04 04:03:56 crc kubenswrapper[4964]: I1004 04:03:56.448433 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d-host" (OuterVolumeSpecName: "host") pod "4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d" (UID: "4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:03:56 crc kubenswrapper[4964]: I1004 04:03:56.448467 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9cvw\" (UniqueName: \"kubernetes.io/projected/4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d-kube-api-access-g9cvw\") pod \"4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d\" (UID: \"4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d\") " Oct 04 04:03:56 crc kubenswrapper[4964]: I1004 04:03:56.449258 4964 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d-host\") on node \"crc\" DevicePath \"\"" Oct 04 04:03:56 crc kubenswrapper[4964]: I1004 04:03:56.457857 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d-kube-api-access-g9cvw" (OuterVolumeSpecName: "kube-api-access-g9cvw") pod "4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d" (UID: "4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d"). InnerVolumeSpecName "kube-api-access-g9cvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:03:56 crc kubenswrapper[4964]: I1004 04:03:56.550580 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9cvw\" (UniqueName: \"kubernetes.io/projected/4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d-kube-api-access-g9cvw\") on node \"crc\" DevicePath \"\"" Oct 04 04:03:56 crc kubenswrapper[4964]: I1004 04:03:56.865309 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d" path="/var/lib/kubelet/pods/4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d/volumes" Oct 04 04:03:57 crc kubenswrapper[4964]: I1004 04:03:57.307384 4964 scope.go:117] "RemoveContainer" containerID="0aab7671f0ad4e9fe3952b8d341e28621313f431bc33f59793e33d8d931872a3" Oct 04 04:03:57 crc kubenswrapper[4964]: I1004 04:03:57.307558 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzs6p/crc-debug-gnjdl" Oct 04 04:03:57 crc kubenswrapper[4964]: I1004 04:03:57.468122 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8_f508dab2-f748-4821-a9c9-c06405b3ecd5/util/0.log" Oct 04 04:03:57 crc kubenswrapper[4964]: I1004 04:03:57.568124 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8_f508dab2-f748-4821-a9c9-c06405b3ecd5/util/0.log" Oct 04 04:03:57 crc kubenswrapper[4964]: I1004 04:03:57.713886 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8_f508dab2-f748-4821-a9c9-c06405b3ecd5/pull/0.log" Oct 04 04:03:57 crc kubenswrapper[4964]: I1004 04:03:57.713937 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8_f508dab2-f748-4821-a9c9-c06405b3ecd5/pull/0.log" Oct 04 04:03:57 crc kubenswrapper[4964]: I1004 04:03:57.857599 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8_f508dab2-f748-4821-a9c9-c06405b3ecd5/util/0.log" Oct 04 04:03:57 crc kubenswrapper[4964]: I1004 04:03:57.861764 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8_f508dab2-f748-4821-a9c9-c06405b3ecd5/pull/0.log" Oct 04 04:03:57 crc kubenswrapper[4964]: I1004 04:03:57.952440 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_22c3d4ceef8cd160b6fb76e8b95937356362d0ffbc24cf9220930228a5wcgm8_f508dab2-f748-4821-a9c9-c06405b3ecd5/extract/0.log" Oct 04 04:03:58 crc kubenswrapper[4964]: I1004 04:03:58.050366 4964 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-kcwbj_a9d92a9d-3e4a-4945-b449-4aed29708295/kube-rbac-proxy/0.log" Oct 04 04:03:58 crc kubenswrapper[4964]: I1004 04:03:58.161019 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-kcwbj_a9d92a9d-3e4a-4945-b449-4aed29708295/manager/0.log" Oct 04 04:03:58 crc kubenswrapper[4964]: I1004 04:03:58.260056 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55cd88dfc-q48nc_35f77ec9-6ace-4f9c-ad47-6956e222902b/kube-rbac-proxy/0.log" Oct 04 04:03:58 crc kubenswrapper[4964]: I1004 04:03:58.345502 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtfn2" event={"ID":"ed8a47d5-0d0c-41c4-bddb-39686a42483c","Type":"ContainerStarted","Data":"2decb8eff8b9ef1b705b57c39b26a1ccbba5cf8b994b7d71e3b9cd1dc260bf81"} Oct 04 04:03:58 crc kubenswrapper[4964]: I1004 04:03:58.374536 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jtfn2" podStartSLOduration=3.338383254 podStartE2EDuration="9.374516919s" podCreationTimestamp="2025-10-04 04:03:49 +0000 UTC" firstStartedPulling="2025-10-04 04:03:51.219675557 +0000 UTC m=+5011.116634195" lastFinishedPulling="2025-10-04 04:03:57.255809212 +0000 UTC m=+5017.152767860" observedRunningTime="2025-10-04 04:03:58.364740708 +0000 UTC m=+5018.261699346" watchObservedRunningTime="2025-10-04 04:03:58.374516919 +0000 UTC m=+5018.271475557" Oct 04 04:03:58 crc kubenswrapper[4964]: I1004 04:03:58.413850 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55cd88dfc-q48nc_35f77ec9-6ace-4f9c-ad47-6956e222902b/manager/0.log" Oct 04 04:03:58 crc kubenswrapper[4964]: I1004 04:03:58.416520 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-q7zh9_082a9114-18e1-40d0-829e-f2758614e49b/kube-rbac-proxy/0.log" Oct 04 04:03:58 crc kubenswrapper[4964]: I1004 04:03:58.492316 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-q7zh9_082a9114-18e1-40d0-829e-f2758614e49b/manager/0.log" Oct 04 04:03:58 crc kubenswrapper[4964]: I1004 04:03:58.584037 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-mbdfq_71f63b59-61c1-43ae-8726-bdc38806ee71/kube-rbac-proxy/0.log" Oct 04 04:03:58 crc kubenswrapper[4964]: I1004 04:03:58.692751 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-mbdfq_71f63b59-61c1-43ae-8726-bdc38806ee71/manager/0.log" Oct 04 04:03:58 crc kubenswrapper[4964]: I1004 04:03:58.770427 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-qpmdv_5d80f2c1-5570-48f8-908f-d580f7c7ecc7/kube-rbac-proxy/0.log" Oct 04 04:03:58 crc kubenswrapper[4964]: I1004 04:03:58.869679 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-qpmdv_5d80f2c1-5570-48f8-908f-d580f7c7ecc7/manager/0.log" Oct 04 04:03:58 crc kubenswrapper[4964]: I1004 04:03:58.927091 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-8f8rc_7a706485-5c1c-4f12-854b-779a385023fe/kube-rbac-proxy/0.log" Oct 04 04:03:58 crc kubenswrapper[4964]: I1004 04:03:58.976330 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-8f8rc_7a706485-5c1c-4f12-854b-779a385023fe/manager/0.log" Oct 04 04:03:59 crc kubenswrapper[4964]: I1004 04:03:59.097577 
4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-n5sks_892610de-e4c4-4b99-a0ca-07fc0ad63df2/kube-rbac-proxy/0.log" Oct 04 04:03:59 crc kubenswrapper[4964]: I1004 04:03:59.233499 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-n5sks_892610de-e4c4-4b99-a0ca-07fc0ad63df2/manager/0.log" Oct 04 04:03:59 crc kubenswrapper[4964]: I1004 04:03:59.259511 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-99sf8_7272252c-6b3a-4680-9f59-37bc87154be8/kube-rbac-proxy/0.log" Oct 04 04:03:59 crc kubenswrapper[4964]: I1004 04:03:59.303047 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-99sf8_7272252c-6b3a-4680-9f59-37bc87154be8/manager/0.log" Oct 04 04:03:59 crc kubenswrapper[4964]: I1004 04:03:59.416097 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-655d88ccb9-frmcj_1ac8bb69-05a0-4faa-a294-5243e4a2e21a/kube-rbac-proxy/0.log" Oct 04 04:03:59 crc kubenswrapper[4964]: I1004 04:03:59.520702 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-655d88ccb9-frmcj_1ac8bb69-05a0-4faa-a294-5243e4a2e21a/manager/0.log" Oct 04 04:03:59 crc kubenswrapper[4964]: I1004 04:03:59.625584 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-sz72v_be3f98e4-03d2-46bb-b7fe-bc050255934c/kube-rbac-proxy/0.log" Oct 04 04:03:59 crc kubenswrapper[4964]: I1004 04:03:59.687401 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh_aab9e95e-6af6-483a-9cef-96a4accd24f9/kube-rbac-proxy/0.log" Oct 04 
04:03:59 crc kubenswrapper[4964]: I1004 04:03:59.720387 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-sz72v_be3f98e4-03d2-46bb-b7fe-bc050255934c/manager/0.log" Oct 04 04:03:59 crc kubenswrapper[4964]: I1004 04:03:59.730609 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:03:59 crc kubenswrapper[4964]: I1004 04:03:59.730685 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:03:59 crc kubenswrapper[4964]: I1004 04:03:59.785840 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:03:59 crc kubenswrapper[4964]: I1004 04:03:59.822709 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-4lwvh_aab9e95e-6af6-483a-9cef-96a4accd24f9/manager/0.log" Oct 04 04:03:59 crc kubenswrapper[4964]: I1004 04:03:59.909159 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-85ckj_4c3ecd12-c2c1-48ba-b75d-e1df6fcb7a4a/kube-rbac-proxy/0.log" Oct 04 04:03:59 crc kubenswrapper[4964]: I1004 04:03:59.969517 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-85ckj_4c3ecd12-c2c1-48ba-b75d-e1df6fcb7a4a/manager/0.log" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.044463 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-nkxrt_e756e118-8e7e-4e1f-827d-cef4acdbb848/kube-rbac-proxy/0.log" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.175214 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-nkxrt_e756e118-8e7e-4e1f-827d-cef4acdbb848/manager/0.log" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.213536 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-strsv_2611c21b-338e-4dc0-b977-e15067937730/kube-rbac-proxy/0.log" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.249555 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-strsv_2611c21b-338e-4dc0-b977-e15067937730/manager/0.log" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.362994 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm_762eb95d-bd98-4d86-8fc8-404234c0a13e/kube-rbac-proxy/0.log" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.404908 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c9f7qm_762eb95d-bd98-4d86-8fc8-404234c0a13e/manager/0.log" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.511388 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kfhbd"] Oct 04 04:04:00 crc kubenswrapper[4964]: E1004 04:04:00.513962 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d" containerName="container-00" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.514285 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d" containerName="container-00" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.514564 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aa108e3-f5bc-485c-ae9b-32da7b3c1a7d" containerName="container-00" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 
04:04:00.516295 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.523776 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kfhbd"] Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.552128 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-545dfb464d-g5kjf_4274bd25-ba07-4036-80b8-86561c1a6f64/kube-rbac-proxy/0.log" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.625831 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5798c59cb8-rgpft_939d1a8b-d019-424e-bf3a-601213f46341/kube-rbac-proxy/0.log" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.639476 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv282\" (UniqueName: \"kubernetes.io/projected/91992027-36e9-4b42-b5a2-c1291be864cb-kube-api-access-zv282\") pod \"certified-operators-kfhbd\" (UID: \"91992027-36e9-4b42-b5a2-c1291be864cb\") " pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.639577 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91992027-36e9-4b42-b5a2-c1291be864cb-utilities\") pod \"certified-operators-kfhbd\" (UID: \"91992027-36e9-4b42-b5a2-c1291be864cb\") " pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.639605 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91992027-36e9-4b42-b5a2-c1291be864cb-catalog-content\") pod \"certified-operators-kfhbd\" (UID: 
\"91992027-36e9-4b42-b5a2-c1291be864cb\") " pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.740730 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91992027-36e9-4b42-b5a2-c1291be864cb-catalog-content\") pod \"certified-operators-kfhbd\" (UID: \"91992027-36e9-4b42-b5a2-c1291be864cb\") " pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.740877 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv282\" (UniqueName: \"kubernetes.io/projected/91992027-36e9-4b42-b5a2-c1291be864cb-kube-api-access-zv282\") pod \"certified-operators-kfhbd\" (UID: \"91992027-36e9-4b42-b5a2-c1291be864cb\") " pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.740951 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91992027-36e9-4b42-b5a2-c1291be864cb-utilities\") pod \"certified-operators-kfhbd\" (UID: \"91992027-36e9-4b42-b5a2-c1291be864cb\") " pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.741467 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91992027-36e9-4b42-b5a2-c1291be864cb-utilities\") pod \"certified-operators-kfhbd\" (UID: \"91992027-36e9-4b42-b5a2-c1291be864cb\") " pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.741492 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91992027-36e9-4b42-b5a2-c1291be864cb-catalog-content\") pod \"certified-operators-kfhbd\" (UID: \"91992027-36e9-4b42-b5a2-c1291be864cb\") 
" pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.773318 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv282\" (UniqueName: \"kubernetes.io/projected/91992027-36e9-4b42-b5a2-c1291be864cb-kube-api-access-zv282\") pod \"certified-operators-kfhbd\" (UID: \"91992027-36e9-4b42-b5a2-c1291be864cb\") " pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.820562 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5798c59cb8-rgpft_939d1a8b-d019-424e-bf3a-601213f46341/operator/0.log" Oct 04 04:04:00 crc kubenswrapper[4964]: I1004 04:04:00.845016 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:01 crc kubenswrapper[4964]: I1004 04:04:01.062870 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-l59hf_a72faf85-343c-4d46-8773-97a366ed031a/registry-server/0.log" Oct 04 04:04:01 crc kubenswrapper[4964]: I1004 04:04:01.224680 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-qgrnf_b4751629-75d2-4c2a-afb5-a7b7915cb644/kube-rbac-proxy/0.log" Oct 04 04:04:01 crc kubenswrapper[4964]: I1004 04:04:01.296875 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-qgrnf_b4751629-75d2-4c2a-afb5-a7b7915cb644/manager/0.log" Oct 04 04:04:01 crc kubenswrapper[4964]: W1004 04:04:01.371761 4964 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91992027_36e9_4b42_b5a2_c1291be864cb.slice/crio-91d80a388f47baac12d44866d26c838da6f3ae057a6312cbc32018c9a6651aeb WatchSource:0}: Error finding container 
91d80a388f47baac12d44866d26c838da6f3ae057a6312cbc32018c9a6651aeb: Status 404 returned error can't find the container with id 91d80a388f47baac12d44866d26c838da6f3ae057a6312cbc32018c9a6651aeb Oct 04 04:04:01 crc kubenswrapper[4964]: I1004 04:04:01.382042 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kfhbd"] Oct 04 04:04:01 crc kubenswrapper[4964]: I1004 04:04:01.382567 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-wlzgd_3fa08171-beb2-42d4-a751-fe46eb179a70/kube-rbac-proxy/0.log" Oct 04 04:04:01 crc kubenswrapper[4964]: I1004 04:04:01.526269 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-wlzgd_3fa08171-beb2-42d4-a751-fe46eb179a70/manager/0.log" Oct 04 04:04:01 crc kubenswrapper[4964]: I1004 04:04:01.550484 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-fvhgm_4b675799-2c4a-4167-bd37-0de27bc8861d/operator/0.log" Oct 04 04:04:01 crc kubenswrapper[4964]: I1004 04:04:01.728489 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-dzqh8_c1244d8a-d20c-4318-9dfd-3617e35e54e9/kube-rbac-proxy/0.log" Oct 04 04:04:01 crc kubenswrapper[4964]: I1004 04:04:01.752114 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-dzqh8_c1244d8a-d20c-4318-9dfd-3617e35e54e9/manager/0.log" Oct 04 04:04:01 crc kubenswrapper[4964]: I1004 04:04:01.826502 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-545dfb464d-g5kjf_4274bd25-ba07-4036-80b8-86561c1a6f64/manager/0.log" Oct 04 04:04:01 crc kubenswrapper[4964]: I1004 04:04:01.833923 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-hkmlt_5ba7848e-5d4e-4de0-a0de-2a8bcd534c90/kube-rbac-proxy/0.log" Oct 04 04:04:01 crc kubenswrapper[4964]: I1004 04:04:01.959984 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-hkmlt_5ba7848e-5d4e-4de0-a0de-2a8bcd534c90/manager/0.log" Oct 04 04:04:01 crc kubenswrapper[4964]: I1004 04:04:01.998481 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-vrfm9_36a7b704-074e-4b3c-a459-e55607c9f604/kube-rbac-proxy/0.log" Oct 04 04:04:02 crc kubenswrapper[4964]: I1004 04:04:02.006552 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-vrfm9_36a7b704-074e-4b3c-a459-e55607c9f604/manager/0.log" Oct 04 04:04:02 crc kubenswrapper[4964]: I1004 04:04:02.128285 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-frmfp_e317dfbb-55c5-49ee-8e16-5bc0532e2dfb/kube-rbac-proxy/0.log" Oct 04 04:04:02 crc kubenswrapper[4964]: I1004 04:04:02.129857 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-frmfp_e317dfbb-55c5-49ee-8e16-5bc0532e2dfb/manager/0.log" Oct 04 04:04:02 crc kubenswrapper[4964]: I1004 04:04:02.384296 4964 generic.go:334] "Generic (PLEG): container finished" podID="91992027-36e9-4b42-b5a2-c1291be864cb" containerID="bc55b671996d18070a90ca882b94f4b2fee77c78222a2d77508e4311f26a2fd8" exitCode=0 Oct 04 04:04:02 crc kubenswrapper[4964]: I1004 04:04:02.384550 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfhbd" 
event={"ID":"91992027-36e9-4b42-b5a2-c1291be864cb","Type":"ContainerDied","Data":"bc55b671996d18070a90ca882b94f4b2fee77c78222a2d77508e4311f26a2fd8"} Oct 04 04:04:02 crc kubenswrapper[4964]: I1004 04:04:02.384575 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfhbd" event={"ID":"91992027-36e9-4b42-b5a2-c1291be864cb","Type":"ContainerStarted","Data":"91d80a388f47baac12d44866d26c838da6f3ae057a6312cbc32018c9a6651aeb"} Oct 04 04:04:04 crc kubenswrapper[4964]: I1004 04:04:04.408326 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfhbd" event={"ID":"91992027-36e9-4b42-b5a2-c1291be864cb","Type":"ContainerStarted","Data":"e4c4ff1e3839940065535ae2ad70e8293707b5cc927e9065c8c0311175021c9b"} Oct 04 04:04:05 crc kubenswrapper[4964]: I1004 04:04:05.418194 4964 generic.go:334] "Generic (PLEG): container finished" podID="91992027-36e9-4b42-b5a2-c1291be864cb" containerID="e4c4ff1e3839940065535ae2ad70e8293707b5cc927e9065c8c0311175021c9b" exitCode=0 Oct 04 04:04:05 crc kubenswrapper[4964]: I1004 04:04:05.418355 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfhbd" event={"ID":"91992027-36e9-4b42-b5a2-c1291be864cb","Type":"ContainerDied","Data":"e4c4ff1e3839940065535ae2ad70e8293707b5cc927e9065c8c0311175021c9b"} Oct 04 04:04:06 crc kubenswrapper[4964]: I1004 04:04:06.427350 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfhbd" event={"ID":"91992027-36e9-4b42-b5a2-c1291be864cb","Type":"ContainerStarted","Data":"d8a2cf7e5db0528d46a6140cf71d7c5aa351f24d1e7920b058a1be1611484f16"} Oct 04 04:04:06 crc kubenswrapper[4964]: I1004 04:04:06.473148 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kfhbd" podStartSLOduration=3.0367414 podStartE2EDuration="6.473131486s" podCreationTimestamp="2025-10-04 04:04:00 
+0000 UTC" firstStartedPulling="2025-10-04 04:04:02.387434964 +0000 UTC m=+5022.284393612" lastFinishedPulling="2025-10-04 04:04:05.82382506 +0000 UTC m=+5025.720783698" observedRunningTime="2025-10-04 04:04:06.462809951 +0000 UTC m=+5026.359768589" watchObservedRunningTime="2025-10-04 04:04:06.473131486 +0000 UTC m=+5026.370090124" Oct 04 04:04:09 crc kubenswrapper[4964]: I1004 04:04:09.795679 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:04:09 crc kubenswrapper[4964]: I1004 04:04:09.845272 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:04:09 crc kubenswrapper[4964]: E1004 04:04:09.845758 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:04:09 crc kubenswrapper[4964]: I1004 04:04:09.852241 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jtfn2"] Oct 04 04:04:10 crc kubenswrapper[4964]: I1004 04:04:10.457504 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jtfn2" podUID="ed8a47d5-0d0c-41c4-bddb-39686a42483c" containerName="registry-server" containerID="cri-o://2decb8eff8b9ef1b705b57c39b26a1ccbba5cf8b994b7d71e3b9cd1dc260bf81" gracePeriod=2 Oct 04 04:04:10 crc kubenswrapper[4964]: I1004 04:04:10.861231 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:10 crc kubenswrapper[4964]: I1004 04:04:10.861589 4964 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:10 crc kubenswrapper[4964]: I1004 04:04:10.907378 4964 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:10 crc kubenswrapper[4964]: I1004 04:04:10.960427 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.079363 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn2hp\" (UniqueName: \"kubernetes.io/projected/ed8a47d5-0d0c-41c4-bddb-39686a42483c-kube-api-access-nn2hp\") pod \"ed8a47d5-0d0c-41c4-bddb-39686a42483c\" (UID: \"ed8a47d5-0d0c-41c4-bddb-39686a42483c\") " Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.079460 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8a47d5-0d0c-41c4-bddb-39686a42483c-catalog-content\") pod \"ed8a47d5-0d0c-41c4-bddb-39686a42483c\" (UID: \"ed8a47d5-0d0c-41c4-bddb-39686a42483c\") " Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.079510 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8a47d5-0d0c-41c4-bddb-39686a42483c-utilities\") pod \"ed8a47d5-0d0c-41c4-bddb-39686a42483c\" (UID: \"ed8a47d5-0d0c-41c4-bddb-39686a42483c\") " Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.080712 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed8a47d5-0d0c-41c4-bddb-39686a42483c-utilities" (OuterVolumeSpecName: "utilities") pod "ed8a47d5-0d0c-41c4-bddb-39686a42483c" (UID: "ed8a47d5-0d0c-41c4-bddb-39686a42483c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.085496 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed8a47d5-0d0c-41c4-bddb-39686a42483c-kube-api-access-nn2hp" (OuterVolumeSpecName: "kube-api-access-nn2hp") pod "ed8a47d5-0d0c-41c4-bddb-39686a42483c" (UID: "ed8a47d5-0d0c-41c4-bddb-39686a42483c"). InnerVolumeSpecName "kube-api-access-nn2hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.134162 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed8a47d5-0d0c-41c4-bddb-39686a42483c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed8a47d5-0d0c-41c4-bddb-39686a42483c" (UID: "ed8a47d5-0d0c-41c4-bddb-39686a42483c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.182091 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8a47d5-0d0c-41c4-bddb-39686a42483c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.182596 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8a47d5-0d0c-41c4-bddb-39686a42483c-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.182672 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn2hp\" (UniqueName: \"kubernetes.io/projected/ed8a47d5-0d0c-41c4-bddb-39686a42483c-kube-api-access-nn2hp\") on node \"crc\" DevicePath \"\"" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.469726 4964 generic.go:334] "Generic (PLEG): container finished" podID="ed8a47d5-0d0c-41c4-bddb-39686a42483c" 
containerID="2decb8eff8b9ef1b705b57c39b26a1ccbba5cf8b994b7d71e3b9cd1dc260bf81" exitCode=0 Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.469802 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jtfn2" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.469847 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtfn2" event={"ID":"ed8a47d5-0d0c-41c4-bddb-39686a42483c","Type":"ContainerDied","Data":"2decb8eff8b9ef1b705b57c39b26a1ccbba5cf8b994b7d71e3b9cd1dc260bf81"} Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.469909 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtfn2" event={"ID":"ed8a47d5-0d0c-41c4-bddb-39686a42483c","Type":"ContainerDied","Data":"39817571488bdf9c2299bbac58d571a7d504f9e21e186700ba39e04aabc09f0a"} Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.469942 4964 scope.go:117] "RemoveContainer" containerID="2decb8eff8b9ef1b705b57c39b26a1ccbba5cf8b994b7d71e3b9cd1dc260bf81" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.493937 4964 scope.go:117] "RemoveContainer" containerID="06f869012d5567a5a401e24fbcbde0566acdde0c3f02108468b599ac47cbef6e" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.538831 4964 scope.go:117] "RemoveContainer" containerID="82a86bdc0242e353cb68ff7cf6067f528b9b3597881272926d112bc013bff8f9" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.541720 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jtfn2"] Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.542853 4964 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.566082 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-jtfn2"] Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.574113 4964 scope.go:117] "RemoveContainer" containerID="2decb8eff8b9ef1b705b57c39b26a1ccbba5cf8b994b7d71e3b9cd1dc260bf81" Oct 04 04:04:11 crc kubenswrapper[4964]: E1004 04:04:11.574582 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2decb8eff8b9ef1b705b57c39b26a1ccbba5cf8b994b7d71e3b9cd1dc260bf81\": container with ID starting with 2decb8eff8b9ef1b705b57c39b26a1ccbba5cf8b994b7d71e3b9cd1dc260bf81 not found: ID does not exist" containerID="2decb8eff8b9ef1b705b57c39b26a1ccbba5cf8b994b7d71e3b9cd1dc260bf81" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.574640 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2decb8eff8b9ef1b705b57c39b26a1ccbba5cf8b994b7d71e3b9cd1dc260bf81"} err="failed to get container status \"2decb8eff8b9ef1b705b57c39b26a1ccbba5cf8b994b7d71e3b9cd1dc260bf81\": rpc error: code = NotFound desc = could not find container \"2decb8eff8b9ef1b705b57c39b26a1ccbba5cf8b994b7d71e3b9cd1dc260bf81\": container with ID starting with 2decb8eff8b9ef1b705b57c39b26a1ccbba5cf8b994b7d71e3b9cd1dc260bf81 not found: ID does not exist" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.574670 4964 scope.go:117] "RemoveContainer" containerID="06f869012d5567a5a401e24fbcbde0566acdde0c3f02108468b599ac47cbef6e" Oct 04 04:04:11 crc kubenswrapper[4964]: E1004 04:04:11.574993 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06f869012d5567a5a401e24fbcbde0566acdde0c3f02108468b599ac47cbef6e\": container with ID starting with 06f869012d5567a5a401e24fbcbde0566acdde0c3f02108468b599ac47cbef6e not found: ID does not exist" containerID="06f869012d5567a5a401e24fbcbde0566acdde0c3f02108468b599ac47cbef6e" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 
04:04:11.575028 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f869012d5567a5a401e24fbcbde0566acdde0c3f02108468b599ac47cbef6e"} err="failed to get container status \"06f869012d5567a5a401e24fbcbde0566acdde0c3f02108468b599ac47cbef6e\": rpc error: code = NotFound desc = could not find container \"06f869012d5567a5a401e24fbcbde0566acdde0c3f02108468b599ac47cbef6e\": container with ID starting with 06f869012d5567a5a401e24fbcbde0566acdde0c3f02108468b599ac47cbef6e not found: ID does not exist" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.575056 4964 scope.go:117] "RemoveContainer" containerID="82a86bdc0242e353cb68ff7cf6067f528b9b3597881272926d112bc013bff8f9" Oct 04 04:04:11 crc kubenswrapper[4964]: E1004 04:04:11.575309 4964 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82a86bdc0242e353cb68ff7cf6067f528b9b3597881272926d112bc013bff8f9\": container with ID starting with 82a86bdc0242e353cb68ff7cf6067f528b9b3597881272926d112bc013bff8f9 not found: ID does not exist" containerID="82a86bdc0242e353cb68ff7cf6067f528b9b3597881272926d112bc013bff8f9" Oct 04 04:04:11 crc kubenswrapper[4964]: I1004 04:04:11.575343 4964 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82a86bdc0242e353cb68ff7cf6067f528b9b3597881272926d112bc013bff8f9"} err="failed to get container status \"82a86bdc0242e353cb68ff7cf6067f528b9b3597881272926d112bc013bff8f9\": rpc error: code = NotFound desc = could not find container \"82a86bdc0242e353cb68ff7cf6067f528b9b3597881272926d112bc013bff8f9\": container with ID starting with 82a86bdc0242e353cb68ff7cf6067f528b9b3597881272926d112bc013bff8f9 not found: ID does not exist" Oct 04 04:04:12 crc kubenswrapper[4964]: I1004 04:04:12.862421 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed8a47d5-0d0c-41c4-bddb-39686a42483c" 
path="/var/lib/kubelet/pods/ed8a47d5-0d0c-41c4-bddb-39686a42483c/volumes" Oct 04 04:04:13 crc kubenswrapper[4964]: I1004 04:04:13.832431 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kfhbd"] Oct 04 04:04:13 crc kubenswrapper[4964]: I1004 04:04:13.832968 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kfhbd" podUID="91992027-36e9-4b42-b5a2-c1291be864cb" containerName="registry-server" containerID="cri-o://d8a2cf7e5db0528d46a6140cf71d7c5aa351f24d1e7920b058a1be1611484f16" gracePeriod=2 Oct 04 04:04:14 crc kubenswrapper[4964]: I1004 04:04:14.496751 4964 generic.go:334] "Generic (PLEG): container finished" podID="91992027-36e9-4b42-b5a2-c1291be864cb" containerID="d8a2cf7e5db0528d46a6140cf71d7c5aa351f24d1e7920b058a1be1611484f16" exitCode=0 Oct 04 04:04:14 crc kubenswrapper[4964]: I1004 04:04:14.496841 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfhbd" event={"ID":"91992027-36e9-4b42-b5a2-c1291be864cb","Type":"ContainerDied","Data":"d8a2cf7e5db0528d46a6140cf71d7c5aa351f24d1e7920b058a1be1611484f16"} Oct 04 04:04:14 crc kubenswrapper[4964]: I1004 04:04:14.803090 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:14 crc kubenswrapper[4964]: I1004 04:04:14.965549 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91992027-36e9-4b42-b5a2-c1291be864cb-utilities\") pod \"91992027-36e9-4b42-b5a2-c1291be864cb\" (UID: \"91992027-36e9-4b42-b5a2-c1291be864cb\") " Oct 04 04:04:14 crc kubenswrapper[4964]: I1004 04:04:14.965875 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91992027-36e9-4b42-b5a2-c1291be864cb-catalog-content\") pod \"91992027-36e9-4b42-b5a2-c1291be864cb\" (UID: \"91992027-36e9-4b42-b5a2-c1291be864cb\") " Oct 04 04:04:14 crc kubenswrapper[4964]: I1004 04:04:14.966053 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv282\" (UniqueName: \"kubernetes.io/projected/91992027-36e9-4b42-b5a2-c1291be864cb-kube-api-access-zv282\") pod \"91992027-36e9-4b42-b5a2-c1291be864cb\" (UID: \"91992027-36e9-4b42-b5a2-c1291be864cb\") " Oct 04 04:04:14 crc kubenswrapper[4964]: I1004 04:04:14.967201 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91992027-36e9-4b42-b5a2-c1291be864cb-utilities" (OuterVolumeSpecName: "utilities") pod "91992027-36e9-4b42-b5a2-c1291be864cb" (UID: "91992027-36e9-4b42-b5a2-c1291be864cb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:04:14 crc kubenswrapper[4964]: I1004 04:04:14.967989 4964 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91992027-36e9-4b42-b5a2-c1291be864cb-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:04:14 crc kubenswrapper[4964]: I1004 04:04:14.972380 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91992027-36e9-4b42-b5a2-c1291be864cb-kube-api-access-zv282" (OuterVolumeSpecName: "kube-api-access-zv282") pod "91992027-36e9-4b42-b5a2-c1291be864cb" (UID: "91992027-36e9-4b42-b5a2-c1291be864cb"). InnerVolumeSpecName "kube-api-access-zv282". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:04:15 crc kubenswrapper[4964]: I1004 04:04:15.014554 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91992027-36e9-4b42-b5a2-c1291be864cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91992027-36e9-4b42-b5a2-c1291be864cb" (UID: "91992027-36e9-4b42-b5a2-c1291be864cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:04:15 crc kubenswrapper[4964]: I1004 04:04:15.070819 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv282\" (UniqueName: \"kubernetes.io/projected/91992027-36e9-4b42-b5a2-c1291be864cb-kube-api-access-zv282\") on node \"crc\" DevicePath \"\"" Oct 04 04:04:15 crc kubenswrapper[4964]: I1004 04:04:15.070899 4964 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91992027-36e9-4b42-b5a2-c1291be864cb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:04:15 crc kubenswrapper[4964]: I1004 04:04:15.523770 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfhbd" event={"ID":"91992027-36e9-4b42-b5a2-c1291be864cb","Type":"ContainerDied","Data":"91d80a388f47baac12d44866d26c838da6f3ae057a6312cbc32018c9a6651aeb"} Oct 04 04:04:15 crc kubenswrapper[4964]: I1004 04:04:15.524139 4964 scope.go:117] "RemoveContainer" containerID="d8a2cf7e5db0528d46a6140cf71d7c5aa351f24d1e7920b058a1be1611484f16" Oct 04 04:04:15 crc kubenswrapper[4964]: I1004 04:04:15.523880 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kfhbd" Oct 04 04:04:15 crc kubenswrapper[4964]: I1004 04:04:15.559976 4964 scope.go:117] "RemoveContainer" containerID="e4c4ff1e3839940065535ae2ad70e8293707b5cc927e9065c8c0311175021c9b" Oct 04 04:04:15 crc kubenswrapper[4964]: I1004 04:04:15.565006 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kfhbd"] Oct 04 04:04:15 crc kubenswrapper[4964]: I1004 04:04:15.573910 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kfhbd"] Oct 04 04:04:15 crc kubenswrapper[4964]: I1004 04:04:15.576685 4964 scope.go:117] "RemoveContainer" containerID="bc55b671996d18070a90ca882b94f4b2fee77c78222a2d77508e4311f26a2fd8" Oct 04 04:04:16 crc kubenswrapper[4964]: I1004 04:04:16.856642 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91992027-36e9-4b42-b5a2-c1291be864cb" path="/var/lib/kubelet/pods/91992027-36e9-4b42-b5a2-c1291be864cb/volumes" Oct 04 04:04:21 crc kubenswrapper[4964]: I1004 04:04:21.263290 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-lkldd_d09e8523-afbc-4f5d-888d-92b350c15f7c/control-plane-machine-set-operator/0.log" Oct 04 04:04:21 crc kubenswrapper[4964]: I1004 04:04:21.439685 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b4x88_cc0ac95b-a7a9-4b23-a073-99146acc645d/machine-api-operator/0.log" Oct 04 04:04:21 crc kubenswrapper[4964]: I1004 04:04:21.447842 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b4x88_cc0ac95b-a7a9-4b23-a073-99146acc645d/kube-rbac-proxy/0.log" Oct 04 04:04:23 crc kubenswrapper[4964]: I1004 04:04:23.845299 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 
04:04:23 crc kubenswrapper[4964]: E1004 04:04:23.846807 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:04:35 crc kubenswrapper[4964]: I1004 04:04:35.200597 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-vkqh4_94a658f9-597a-4377-a7a2-b5b46e1fe345/cert-manager-controller/0.log" Oct 04 04:04:35 crc kubenswrapper[4964]: I1004 04:04:35.335745 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-zlb46_454987ea-9834-46bc-b79a-ba124a2a44ed/cert-manager-cainjector/0.log" Oct 04 04:04:35 crc kubenswrapper[4964]: I1004 04:04:35.365365 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-dksdz_921b427a-cfcd-4125-aab7-e1c073058743/cert-manager-webhook/0.log" Oct 04 04:04:35 crc kubenswrapper[4964]: I1004 04:04:35.845599 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:04:35 crc kubenswrapper[4964]: E1004 04:04:35.845963 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:04:46 crc kubenswrapper[4964]: I1004 04:04:46.846068 4964 scope.go:117] "RemoveContainer" 
containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:04:46 crc kubenswrapper[4964]: E1004 04:04:46.848117 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:04:48 crc kubenswrapper[4964]: I1004 04:04:48.989827 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-z47kl_3c056ed4-2fd6-42dd-8702-a84d27d26fd2/nmstate-console-plugin/0.log" Oct 04 04:04:49 crc kubenswrapper[4964]: I1004 04:04:49.132862 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-9z4n9_c17a3833-2ef7-4e9b-a1b0-b065ead5133f/nmstate-handler/0.log" Oct 04 04:04:49 crc kubenswrapper[4964]: I1004 04:04:49.174358 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-s5r9n_ffcb8f9b-4f68-48ed-a155-99a05b8f508b/kube-rbac-proxy/0.log" Oct 04 04:04:49 crc kubenswrapper[4964]: I1004 04:04:49.184165 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-s5r9n_ffcb8f9b-4f68-48ed-a155-99a05b8f508b/nmstate-metrics/0.log" Oct 04 04:04:49 crc kubenswrapper[4964]: I1004 04:04:49.371586 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-clc2b_db3cd097-a1d9-40ac-af2c-e5d35c8fcd95/nmstate-webhook/0.log" Oct 04 04:04:49 crc kubenswrapper[4964]: I1004 04:04:49.376265 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-c2nvh_9bf5755a-6882-4f0e-9146-0d925ad5ccc5/nmstate-operator/0.log" Oct 04 04:04:59 crc kubenswrapper[4964]: I1004 04:04:59.845404 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:04:59 crc kubenswrapper[4964]: E1004 04:04:59.846209 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:05:03 crc kubenswrapper[4964]: I1004 04:05:03.716171 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-tvxzc_e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc/kube-rbac-proxy/0.log" Oct 04 04:05:03 crc kubenswrapper[4964]: I1004 04:05:03.845197 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-tvxzc_e4ac4d52-fb50-4d4a-9e5b-b8a0aed2c6dc/controller/0.log" Oct 04 04:05:03 crc kubenswrapper[4964]: I1004 04:05:03.890784 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/cp-frr-files/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.110787 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/cp-reloader/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.125049 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/cp-frr-files/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.134371 4964 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/cp-metrics/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.182377 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/cp-reloader/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.269276 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/cp-frr-files/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.325251 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/cp-metrics/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.327598 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/cp-reloader/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.395308 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/cp-metrics/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.511363 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/cp-frr-files/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.539344 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/cp-reloader/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.541914 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/cp-metrics/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.574337 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/controller/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.736907 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/kube-rbac-proxy/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.740594 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/frr-metrics/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.763749 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/kube-rbac-proxy-frr/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.937239 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/reloader/0.log" Oct 04 04:05:04 crc kubenswrapper[4964]: I1004 04:05:04.997376 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-6xh6v_ae35709d-6c76-49dd-a685-664e41a117ba/frr-k8s-webhook-server/0.log" Oct 04 04:05:05 crc kubenswrapper[4964]: I1004 04:05:05.159721 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-85cc4db5cb-nh55m_84f70c83-2c96-4e7c-99c0-4322d4b97f04/manager/0.log" Oct 04 04:05:05 crc kubenswrapper[4964]: I1004 04:05:05.358976 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-66d576bc7b-2j5gm_f95fd202-cbdb-4a99-ac3e-0182838a3c96/webhook-server/0.log" Oct 04 04:05:05 crc kubenswrapper[4964]: I1004 04:05:05.371283 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-s6v9p_352336ed-63d9-4f9e-86e0-e25db230594a/kube-rbac-proxy/0.log" Oct 04 04:05:05 crc kubenswrapper[4964]: I1004 04:05:05.950852 4964 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-s6v9p_352336ed-63d9-4f9e-86e0-e25db230594a/speaker/0.log" Oct 04 04:05:06 crc kubenswrapper[4964]: I1004 04:05:06.406586 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fs4w8_7e06f03f-a746-4f64-a49c-16bc836bc682/frr/0.log" Oct 04 04:05:11 crc kubenswrapper[4964]: I1004 04:05:11.846509 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:05:11 crc kubenswrapper[4964]: E1004 04:05:11.847853 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:05:20 crc kubenswrapper[4964]: I1004 04:05:20.489978 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr_eee5f115-fb35-42ad-b604-33c3fc0f4d35/util/0.log" Oct 04 04:05:21 crc kubenswrapper[4964]: I1004 04:05:21.228117 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr_eee5f115-fb35-42ad-b604-33c3fc0f4d35/pull/0.log" Oct 04 04:05:21 crc kubenswrapper[4964]: I1004 04:05:21.228604 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr_eee5f115-fb35-42ad-b604-33c3fc0f4d35/util/0.log" Oct 04 04:05:21 crc kubenswrapper[4964]: I1004 04:05:21.261754 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr_eee5f115-fb35-42ad-b604-33c3fc0f4d35/pull/0.log" Oct 04 04:05:21 crc kubenswrapper[4964]: I1004 04:05:21.380902 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr_eee5f115-fb35-42ad-b604-33c3fc0f4d35/util/0.log" Oct 04 04:05:21 crc kubenswrapper[4964]: I1004 04:05:21.390070 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr_eee5f115-fb35-42ad-b604-33c3fc0f4d35/pull/0.log" Oct 04 04:05:21 crc kubenswrapper[4964]: I1004 04:05:21.407132 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zp2zr_eee5f115-fb35-42ad-b604-33c3fc0f4d35/extract/0.log" Oct 04 04:05:21 crc kubenswrapper[4964]: I1004 04:05:21.569765 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7b6j9_63e7ee31-bc4a-4f08-bdec-51eb8f16be69/extract-utilities/0.log" Oct 04 04:05:21 crc kubenswrapper[4964]: I1004 04:05:21.740336 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7b6j9_63e7ee31-bc4a-4f08-bdec-51eb8f16be69/extract-content/0.log" Oct 04 04:05:21 crc kubenswrapper[4964]: I1004 04:05:21.752133 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7b6j9_63e7ee31-bc4a-4f08-bdec-51eb8f16be69/extract-content/0.log" Oct 04 04:05:21 crc kubenswrapper[4964]: I1004 04:05:21.758765 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7b6j9_63e7ee31-bc4a-4f08-bdec-51eb8f16be69/extract-utilities/0.log" Oct 04 04:05:21 crc kubenswrapper[4964]: I1004 04:05:21.915687 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-7b6j9_63e7ee31-bc4a-4f08-bdec-51eb8f16be69/extract-content/0.log" Oct 04 04:05:21 crc kubenswrapper[4964]: I1004 04:05:21.945149 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7b6j9_63e7ee31-bc4a-4f08-bdec-51eb8f16be69/extract-utilities/0.log" Oct 04 04:05:22 crc kubenswrapper[4964]: I1004 04:05:22.122933 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vvnjd_bcda0354-8784-4a0e-86bc-b06464d136d9/extract-utilities/0.log" Oct 04 04:05:22 crc kubenswrapper[4964]: I1004 04:05:22.328605 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vvnjd_bcda0354-8784-4a0e-86bc-b06464d136d9/extract-content/0.log" Oct 04 04:05:22 crc kubenswrapper[4964]: I1004 04:05:22.391417 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vvnjd_bcda0354-8784-4a0e-86bc-b06464d136d9/extract-content/0.log" Oct 04 04:05:22 crc kubenswrapper[4964]: I1004 04:05:22.394116 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vvnjd_bcda0354-8784-4a0e-86bc-b06464d136d9/extract-utilities/0.log" Oct 04 04:05:22 crc kubenswrapper[4964]: I1004 04:05:22.546289 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7b6j9_63e7ee31-bc4a-4f08-bdec-51eb8f16be69/registry-server/0.log" Oct 04 04:05:22 crc kubenswrapper[4964]: I1004 04:05:22.553215 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vvnjd_bcda0354-8784-4a0e-86bc-b06464d136d9/extract-utilities/0.log" Oct 04 04:05:22 crc kubenswrapper[4964]: I1004 04:05:22.570220 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-vvnjd_bcda0354-8784-4a0e-86bc-b06464d136d9/extract-content/0.log" Oct 04 04:05:22 crc kubenswrapper[4964]: I1004 04:05:22.795006 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6_7dcb37ec-6e78-4a26-897c-96c10ee42aba/util/0.log" Oct 04 04:05:22 crc kubenswrapper[4964]: I1004 04:05:22.844820 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:05:22 crc kubenswrapper[4964]: E1004 04:05:22.845121 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:05:23 crc kubenswrapper[4964]: I1004 04:05:23.007154 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6_7dcb37ec-6e78-4a26-897c-96c10ee42aba/pull/0.log" Oct 04 04:05:23 crc kubenswrapper[4964]: I1004 04:05:23.062749 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6_7dcb37ec-6e78-4a26-897c-96c10ee42aba/pull/0.log" Oct 04 04:05:23 crc kubenswrapper[4964]: I1004 04:05:23.100913 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6_7dcb37ec-6e78-4a26-897c-96c10ee42aba/util/0.log" Oct 04 04:05:23 crc kubenswrapper[4964]: I1004 04:05:23.270325 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6_7dcb37ec-6e78-4a26-897c-96c10ee42aba/pull/0.log" Oct 04 04:05:23 crc kubenswrapper[4964]: I1004 04:05:23.333008 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vvnjd_bcda0354-8784-4a0e-86bc-b06464d136d9/registry-server/0.log" Oct 04 04:05:23 crc kubenswrapper[4964]: I1004 04:05:23.359110 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6_7dcb37ec-6e78-4a26-897c-96c10ee42aba/util/0.log" Oct 04 04:05:23 crc kubenswrapper[4964]: I1004 04:05:23.375703 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ckt8q6_7dcb37ec-6e78-4a26-897c-96c10ee42aba/extract/0.log" Oct 04 04:05:23 crc kubenswrapper[4964]: I1004 04:05:23.509055 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kqprv_0da7f331-58e4-48cf-abb4-a8c8fd7b137b/marketplace-operator/0.log" Oct 04 04:05:23 crc kubenswrapper[4964]: I1004 04:05:23.558168 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zqv28_9e9d5421-37a5-4691-be42-0d69ce5c9150/extract-utilities/0.log" Oct 04 04:05:23 crc kubenswrapper[4964]: I1004 04:05:23.716764 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zqv28_9e9d5421-37a5-4691-be42-0d69ce5c9150/extract-utilities/0.log" Oct 04 04:05:23 crc kubenswrapper[4964]: I1004 04:05:23.718093 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zqv28_9e9d5421-37a5-4691-be42-0d69ce5c9150/extract-content/0.log" Oct 04 04:05:23 crc kubenswrapper[4964]: I1004 04:05:23.734086 4964 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-zqv28_9e9d5421-37a5-4691-be42-0d69ce5c9150/extract-content/0.log" Oct 04 04:05:23 crc kubenswrapper[4964]: I1004 04:05:23.849187 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zqv28_9e9d5421-37a5-4691-be42-0d69ce5c9150/extract-utilities/0.log" Oct 04 04:05:23 crc kubenswrapper[4964]: I1004 04:05:23.849262 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zqv28_9e9d5421-37a5-4691-be42-0d69ce5c9150/extract-content/0.log" Oct 04 04:05:23 crc kubenswrapper[4964]: I1004 04:05:23.902802 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rphwc_1fa77cb7-85eb-4961-b663-dc464f81426b/extract-utilities/0.log" Oct 04 04:05:24 crc kubenswrapper[4964]: I1004 04:05:24.039136 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zqv28_9e9d5421-37a5-4691-be42-0d69ce5c9150/registry-server/0.log" Oct 04 04:05:24 crc kubenswrapper[4964]: I1004 04:05:24.145426 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rphwc_1fa77cb7-85eb-4961-b663-dc464f81426b/extract-utilities/0.log" Oct 04 04:05:24 crc kubenswrapper[4964]: I1004 04:05:24.151552 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rphwc_1fa77cb7-85eb-4961-b663-dc464f81426b/extract-content/0.log" Oct 04 04:05:24 crc kubenswrapper[4964]: I1004 04:05:24.160678 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rphwc_1fa77cb7-85eb-4961-b663-dc464f81426b/extract-content/0.log" Oct 04 04:05:24 crc kubenswrapper[4964]: I1004 04:05:24.335703 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rphwc_1fa77cb7-85eb-4961-b663-dc464f81426b/extract-content/0.log" Oct 
04 04:05:24 crc kubenswrapper[4964]: I1004 04:05:24.337115 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rphwc_1fa77cb7-85eb-4961-b663-dc464f81426b/extract-utilities/0.log" Oct 04 04:05:24 crc kubenswrapper[4964]: I1004 04:05:24.855732 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rphwc_1fa77cb7-85eb-4961-b663-dc464f81426b/registry-server/0.log" Oct 04 04:05:33 crc kubenswrapper[4964]: I1004 04:05:33.845358 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:05:33 crc kubenswrapper[4964]: E1004 04:05:33.846168 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:05:44 crc kubenswrapper[4964]: I1004 04:05:44.845593 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:05:44 crc kubenswrapper[4964]: E1004 04:05:44.846510 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:05:55 crc kubenswrapper[4964]: I1004 04:05:55.846140 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 
04:05:55 crc kubenswrapper[4964]: E1004 04:05:55.846919 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:06:10 crc kubenswrapper[4964]: I1004 04:06:10.858336 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:06:10 crc kubenswrapper[4964]: E1004 04:06:10.859507 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:06:22 crc kubenswrapper[4964]: I1004 04:06:22.855189 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:06:22 crc kubenswrapper[4964]: E1004 04:06:22.856308 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:06:34 crc kubenswrapper[4964]: I1004 04:06:34.845879 4964 scope.go:117] "RemoveContainer" 
containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:06:34 crc kubenswrapper[4964]: E1004 04:06:34.846529 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:06:48 crc kubenswrapper[4964]: I1004 04:06:48.848937 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:06:48 crc kubenswrapper[4964]: E1004 04:06:48.850107 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:07:01 crc kubenswrapper[4964]: I1004 04:07:01.846724 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:07:01 crc kubenswrapper[4964]: E1004 04:07:01.848240 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:07:13 crc kubenswrapper[4964]: I1004 04:07:13.845912 4964 scope.go:117] 
"RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:07:13 crc kubenswrapper[4964]: E1004 04:07:13.847163 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:07:27 crc kubenswrapper[4964]: I1004 04:07:27.846336 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:07:27 crc kubenswrapper[4964]: E1004 04:07:27.847342 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:07:41 crc kubenswrapper[4964]: I1004 04:07:41.845550 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:07:41 crc kubenswrapper[4964]: E1004 04:07:41.846261 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:07:42 crc kubenswrapper[4964]: I1004 04:07:42.792868 
4964 generic.go:334] "Generic (PLEG): container finished" podID="a215b0cf-24f2-4e4e-a1df-f6d947c27301" containerID="14f54e6c290ee526a013f17d64cdd8d74cecaf549a0008fab11f0d28a25f3399" exitCode=0 Oct 04 04:07:42 crc kubenswrapper[4964]: I1004 04:07:42.792984 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzs6p/must-gather-kzb6l" event={"ID":"a215b0cf-24f2-4e4e-a1df-f6d947c27301","Type":"ContainerDied","Data":"14f54e6c290ee526a013f17d64cdd8d74cecaf549a0008fab11f0d28a25f3399"} Oct 04 04:07:42 crc kubenswrapper[4964]: I1004 04:07:42.794288 4964 scope.go:117] "RemoveContainer" containerID="14f54e6c290ee526a013f17d64cdd8d74cecaf549a0008fab11f0d28a25f3399" Oct 04 04:07:42 crc kubenswrapper[4964]: I1004 04:07:42.892367 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vzs6p_must-gather-kzb6l_a215b0cf-24f2-4e4e-a1df-f6d947c27301/gather/0.log" Oct 04 04:07:50 crc kubenswrapper[4964]: I1004 04:07:50.607675 4964 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vzs6p/must-gather-kzb6l"] Oct 04 04:07:50 crc kubenswrapper[4964]: I1004 04:07:50.610265 4964 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vzs6p/must-gather-kzb6l" podUID="a215b0cf-24f2-4e4e-a1df-f6d947c27301" containerName="copy" containerID="cri-o://6c893e9d1772d9f3ae5a3c078896e9cde869f4c34e7e2749297612ebb43c795f" gracePeriod=2 Oct 04 04:07:50 crc kubenswrapper[4964]: I1004 04:07:50.621205 4964 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vzs6p/must-gather-kzb6l"] Oct 04 04:07:50 crc kubenswrapper[4964]: I1004 04:07:50.873133 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vzs6p_must-gather-kzb6l_a215b0cf-24f2-4e4e-a1df-f6d947c27301/copy/0.log" Oct 04 04:07:50 crc kubenswrapper[4964]: I1004 04:07:50.873709 4964 generic.go:334] "Generic (PLEG): container finished" 
podID="a215b0cf-24f2-4e4e-a1df-f6d947c27301" containerID="6c893e9d1772d9f3ae5a3c078896e9cde869f4c34e7e2749297612ebb43c795f" exitCode=143 Oct 04 04:07:51 crc kubenswrapper[4964]: I1004 04:07:51.329571 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vzs6p_must-gather-kzb6l_a215b0cf-24f2-4e4e-a1df-f6d947c27301/copy/0.log" Oct 04 04:07:51 crc kubenswrapper[4964]: I1004 04:07:51.329992 4964 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vzs6p/must-gather-kzb6l" Oct 04 04:07:51 crc kubenswrapper[4964]: I1004 04:07:51.472902 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhm55\" (UniqueName: \"kubernetes.io/projected/a215b0cf-24f2-4e4e-a1df-f6d947c27301-kube-api-access-jhm55\") pod \"a215b0cf-24f2-4e4e-a1df-f6d947c27301\" (UID: \"a215b0cf-24f2-4e4e-a1df-f6d947c27301\") " Oct 04 04:07:51 crc kubenswrapper[4964]: I1004 04:07:51.473395 4964 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a215b0cf-24f2-4e4e-a1df-f6d947c27301-must-gather-output\") pod \"a215b0cf-24f2-4e4e-a1df-f6d947c27301\" (UID: \"a215b0cf-24f2-4e4e-a1df-f6d947c27301\") " Oct 04 04:07:51 crc kubenswrapper[4964]: I1004 04:07:51.482642 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a215b0cf-24f2-4e4e-a1df-f6d947c27301-kube-api-access-jhm55" (OuterVolumeSpecName: "kube-api-access-jhm55") pod "a215b0cf-24f2-4e4e-a1df-f6d947c27301" (UID: "a215b0cf-24f2-4e4e-a1df-f6d947c27301"). InnerVolumeSpecName "kube-api-access-jhm55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:07:51 crc kubenswrapper[4964]: I1004 04:07:51.576848 4964 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhm55\" (UniqueName: \"kubernetes.io/projected/a215b0cf-24f2-4e4e-a1df-f6d947c27301-kube-api-access-jhm55\") on node \"crc\" DevicePath \"\"" Oct 04 04:07:51 crc kubenswrapper[4964]: I1004 04:07:51.685339 4964 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a215b0cf-24f2-4e4e-a1df-f6d947c27301-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a215b0cf-24f2-4e4e-a1df-f6d947c27301" (UID: "a215b0cf-24f2-4e4e-a1df-f6d947c27301"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:07:51 crc kubenswrapper[4964]: I1004 04:07:51.780773 4964 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a215b0cf-24f2-4e4e-a1df-f6d947c27301-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 04 04:07:51 crc kubenswrapper[4964]: I1004 04:07:51.883509 4964 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vzs6p_must-gather-kzb6l_a215b0cf-24f2-4e4e-a1df-f6d947c27301/copy/0.log" Oct 04 04:07:51 crc kubenswrapper[4964]: I1004 04:07:51.883945 4964 scope.go:117] "RemoveContainer" containerID="6c893e9d1772d9f3ae5a3c078896e9cde869f4c34e7e2749297612ebb43c795f" Oct 04 04:07:51 crc kubenswrapper[4964]: I1004 04:07:51.884021 4964 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzs6p/must-gather-kzb6l" Oct 04 04:07:51 crc kubenswrapper[4964]: I1004 04:07:51.933850 4964 scope.go:117] "RemoveContainer" containerID="14f54e6c290ee526a013f17d64cdd8d74cecaf549a0008fab11f0d28a25f3399" Oct 04 04:07:52 crc kubenswrapper[4964]: I1004 04:07:52.864145 4964 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a215b0cf-24f2-4e4e-a1df-f6d947c27301" path="/var/lib/kubelet/pods/a215b0cf-24f2-4e4e-a1df-f6d947c27301/volumes" Oct 04 04:07:54 crc kubenswrapper[4964]: I1004 04:07:54.845497 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:07:54 crc kubenswrapper[4964]: E1004 04:07:54.846174 4964 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m7mv7_openshift-machine-config-operator(95c02c3c-a484-46f9-a96d-8650b8f9c67f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" Oct 04 04:08:05 crc kubenswrapper[4964]: I1004 04:08:05.845185 4964 scope.go:117] "RemoveContainer" containerID="83dda035c94da71f32d8725579533e07687da8383ccf38a980fee18723d6c09c" Oct 04 04:08:07 crc kubenswrapper[4964]: I1004 04:08:07.037528 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" event={"ID":"95c02c3c-a484-46f9-a96d-8650b8f9c67f","Type":"ContainerStarted","Data":"fa4b18c26588b3f115e9e899f056907f865f7b7e1c1d74687dbc5a1f1d9681cf"} Oct 04 04:08:31 crc kubenswrapper[4964]: I1004 04:08:31.210729 4964 scope.go:117] "RemoveContainer" containerID="3e7f31c851dd6bab9959f23ad41bed4f4d4148bfa5ec5f7e4c7f04f461bb5987" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.356343 4964 scope.go:117] "RemoveContainer" 
containerID="c13304d3f0ebb4ef43972e37ab4b74b610072f943070326424412e6ec19ea4a0" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.620927 4964 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bdtwr"] Oct 04 04:10:31 crc kubenswrapper[4964]: E1004 04:10:31.621482 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a215b0cf-24f2-4e4e-a1df-f6d947c27301" containerName="copy" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.621509 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="a215b0cf-24f2-4e4e-a1df-f6d947c27301" containerName="copy" Oct 04 04:10:31 crc kubenswrapper[4964]: E1004 04:10:31.621541 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91992027-36e9-4b42-b5a2-c1291be864cb" containerName="registry-server" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.621552 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="91992027-36e9-4b42-b5a2-c1291be864cb" containerName="registry-server" Oct 04 04:10:31 crc kubenswrapper[4964]: E1004 04:10:31.621586 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8a47d5-0d0c-41c4-bddb-39686a42483c" containerName="extract-content" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.621597 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8a47d5-0d0c-41c4-bddb-39686a42483c" containerName="extract-content" Oct 04 04:10:31 crc kubenswrapper[4964]: E1004 04:10:31.621645 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8a47d5-0d0c-41c4-bddb-39686a42483c" containerName="registry-server" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.621655 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8a47d5-0d0c-41c4-bddb-39686a42483c" containerName="registry-server" Oct 04 04:10:31 crc kubenswrapper[4964]: E1004 04:10:31.621678 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a215b0cf-24f2-4e4e-a1df-f6d947c27301" 
containerName="gather" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.621688 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="a215b0cf-24f2-4e4e-a1df-f6d947c27301" containerName="gather" Oct 04 04:10:31 crc kubenswrapper[4964]: E1004 04:10:31.621706 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91992027-36e9-4b42-b5a2-c1291be864cb" containerName="extract-utilities" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.621717 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="91992027-36e9-4b42-b5a2-c1291be864cb" containerName="extract-utilities" Oct 04 04:10:31 crc kubenswrapper[4964]: E1004 04:10:31.621739 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8a47d5-0d0c-41c4-bddb-39686a42483c" containerName="extract-utilities" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.621748 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8a47d5-0d0c-41c4-bddb-39686a42483c" containerName="extract-utilities" Oct 04 04:10:31 crc kubenswrapper[4964]: E1004 04:10:31.621766 4964 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91992027-36e9-4b42-b5a2-c1291be864cb" containerName="extract-content" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.621776 4964 state_mem.go:107] "Deleted CPUSet assignment" podUID="91992027-36e9-4b42-b5a2-c1291be864cb" containerName="extract-content" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.622121 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed8a47d5-0d0c-41c4-bddb-39686a42483c" containerName="registry-server" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.622151 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="a215b0cf-24f2-4e4e-a1df-f6d947c27301" containerName="gather" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.622177 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="a215b0cf-24f2-4e4e-a1df-f6d947c27301" containerName="copy" Oct 
04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.622205 4964 memory_manager.go:354] "RemoveStaleState removing state" podUID="91992027-36e9-4b42-b5a2-c1291be864cb" containerName="registry-server" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.624944 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdtwr" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.631990 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdtwr"] Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.707846 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/511ff5ee-7f03-46ec-8acc-6ee4560fbab5-utilities\") pod \"redhat-marketplace-bdtwr\" (UID: \"511ff5ee-7f03-46ec-8acc-6ee4560fbab5\") " pod="openshift-marketplace/redhat-marketplace-bdtwr" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.707904 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm8cd\" (UniqueName: \"kubernetes.io/projected/511ff5ee-7f03-46ec-8acc-6ee4560fbab5-kube-api-access-jm8cd\") pod \"redhat-marketplace-bdtwr\" (UID: \"511ff5ee-7f03-46ec-8acc-6ee4560fbab5\") " pod="openshift-marketplace/redhat-marketplace-bdtwr" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.708098 4964 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/511ff5ee-7f03-46ec-8acc-6ee4560fbab5-catalog-content\") pod \"redhat-marketplace-bdtwr\" (UID: \"511ff5ee-7f03-46ec-8acc-6ee4560fbab5\") " pod="openshift-marketplace/redhat-marketplace-bdtwr" Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.809684 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/511ff5ee-7f03-46ec-8acc-6ee4560fbab5-utilities\") pod \"redhat-marketplace-bdtwr\" (UID: \"511ff5ee-7f03-46ec-8acc-6ee4560fbab5\") " pod="openshift-marketplace/redhat-marketplace-bdtwr"
Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.809734 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm8cd\" (UniqueName: \"kubernetes.io/projected/511ff5ee-7f03-46ec-8acc-6ee4560fbab5-kube-api-access-jm8cd\") pod \"redhat-marketplace-bdtwr\" (UID: \"511ff5ee-7f03-46ec-8acc-6ee4560fbab5\") " pod="openshift-marketplace/redhat-marketplace-bdtwr"
Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.809789 4964 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/511ff5ee-7f03-46ec-8acc-6ee4560fbab5-catalog-content\") pod \"redhat-marketplace-bdtwr\" (UID: \"511ff5ee-7f03-46ec-8acc-6ee4560fbab5\") " pod="openshift-marketplace/redhat-marketplace-bdtwr"
Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.810408 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/511ff5ee-7f03-46ec-8acc-6ee4560fbab5-catalog-content\") pod \"redhat-marketplace-bdtwr\" (UID: \"511ff5ee-7f03-46ec-8acc-6ee4560fbab5\") " pod="openshift-marketplace/redhat-marketplace-bdtwr"
Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.810435 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/511ff5ee-7f03-46ec-8acc-6ee4560fbab5-utilities\") pod \"redhat-marketplace-bdtwr\" (UID: \"511ff5ee-7f03-46ec-8acc-6ee4560fbab5\") " pod="openshift-marketplace/redhat-marketplace-bdtwr"
Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.833069 4964 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm8cd\" (UniqueName: \"kubernetes.io/projected/511ff5ee-7f03-46ec-8acc-6ee4560fbab5-kube-api-access-jm8cd\") pod \"redhat-marketplace-bdtwr\" (UID: \"511ff5ee-7f03-46ec-8acc-6ee4560fbab5\") " pod="openshift-marketplace/redhat-marketplace-bdtwr"
Oct 04 04:10:31 crc kubenswrapper[4964]: I1004 04:10:31.974472 4964 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdtwr"
Oct 04 04:10:32 crc kubenswrapper[4964]: I1004 04:10:32.437417 4964 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdtwr"]
Oct 04 04:10:33 crc kubenswrapper[4964]: I1004 04:10:33.718593 4964 generic.go:334] "Generic (PLEG): container finished" podID="511ff5ee-7f03-46ec-8acc-6ee4560fbab5" containerID="39a6dd010cedfb5e541e903ff842c906b042f3e7ab8a8f2b1b30236206b7eab1" exitCode=0
Oct 04 04:10:33 crc kubenswrapper[4964]: I1004 04:10:33.718688 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdtwr" event={"ID":"511ff5ee-7f03-46ec-8acc-6ee4560fbab5","Type":"ContainerDied","Data":"39a6dd010cedfb5e541e903ff842c906b042f3e7ab8a8f2b1b30236206b7eab1"}
Oct 04 04:10:33 crc kubenswrapper[4964]: I1004 04:10:33.719803 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdtwr" event={"ID":"511ff5ee-7f03-46ec-8acc-6ee4560fbab5","Type":"ContainerStarted","Data":"66f3111b9e33d9bec80a96604833eabc532b63e31b03e6c391e240da62750381"}
Oct 04 04:10:33 crc kubenswrapper[4964]: I1004 04:10:33.723577 4964 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 04 04:10:34 crc kubenswrapper[4964]: I1004 04:10:34.448960 4964 patch_prober.go:28] interesting pod/machine-config-daemon-m7mv7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 04 04:10:34 crc kubenswrapper[4964]: I1004 04:10:34.449300 4964 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m7mv7" podUID="95c02c3c-a484-46f9-a96d-8650b8f9c67f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 04 04:10:35 crc kubenswrapper[4964]: I1004 04:10:35.738327 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdtwr" event={"ID":"511ff5ee-7f03-46ec-8acc-6ee4560fbab5","Type":"ContainerStarted","Data":"eab02efe383f96b4b9635f7d3fc55a059d4dda4f32f45d18bd99610d29792c4a"}
Oct 04 04:10:36 crc kubenswrapper[4964]: I1004 04:10:36.752834 4964 generic.go:334] "Generic (PLEG): container finished" podID="511ff5ee-7f03-46ec-8acc-6ee4560fbab5" containerID="eab02efe383f96b4b9635f7d3fc55a059d4dda4f32f45d18bd99610d29792c4a" exitCode=0
Oct 04 04:10:36 crc kubenswrapper[4964]: I1004 04:10:36.752890 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdtwr" event={"ID":"511ff5ee-7f03-46ec-8acc-6ee4560fbab5","Type":"ContainerDied","Data":"eab02efe383f96b4b9635f7d3fc55a059d4dda4f32f45d18bd99610d29792c4a"}
Oct 04 04:10:37 crc kubenswrapper[4964]: I1004 04:10:37.769559 4964 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdtwr" event={"ID":"511ff5ee-7f03-46ec-8acc-6ee4560fbab5","Type":"ContainerStarted","Data":"d09994bfb93689f6614e3fa3943582362fc13b79bc61e1fa33fbe7ee54745a8f"}
Oct 04 04:10:37 crc kubenswrapper[4964]: I1004 04:10:37.794671 4964 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bdtwr" podStartSLOduration=3.341295424 podStartE2EDuration="6.794646013s" podCreationTimestamp="2025-10-04 04:10:31 +0000 UTC" firstStartedPulling="2025-10-04 04:10:33.72301779 +0000 UTC m=+5413.619976458" lastFinishedPulling="2025-10-04 04:10:37.176368379 +0000 UTC m=+5417.073327047" observedRunningTime="2025-10-04 04:10:37.794496609 +0000 UTC m=+5417.691455317" watchObservedRunningTime="2025-10-04 04:10:37.794646013 +0000 UTC m=+5417.691604691"